IEEE Cloud Computing
A digital magazine in support of the IEEE Cloud Computing Initiative (Prototype)

Securing the Cloud
What's Special? (p. 3) / Fraudulent Resource Consumption (p. 14)

May/June 2013
Technical Cosponsors

Securing the Cloud

One of the largest problems to be addressed in cloud computing is that of security: identifying threats, understanding responses, and evaluating the tradeoffs involved in making this emerging technology secure for common use. This prototype issue collects articles on this topic as a sample of what to expect from this new strategic publication, whose first issue is to be published early next year.

Contents
Guest Editor's Introduction, Jon Rokne
3 What's Special About Cloud Security? Peter Mell
6 Toward Accountability in the Cloud, Siani Pearson
12 Public Sector Clouds Beginning to Blossom: Efficiency, New Culture Trumping Security Fears, Greg Goth
14 The Insecurity of Cloud Utility Models, Joseph Idziorek, Mark F. Tannian, and Doug Jacobson
20 The Threat in the Cloud, Matthew Green
23 Implementing Effective Controls in a Mobile, Agile, Cloud-Enabled Enterprise, Dave Martin

Cloud Computing Initiative Steering Committee: Steve Diamond (chair), David Bernstein, Nim Cheung, Mark Davis, Kathy Grise (program manager), Michael Lightner, Mary Lynne Nielsen, Jon Rokne, Jennifer Schopf, and Doug Zuckerman.

IEEE Computer Society Staff: Angela Burgess, Executive Director; Robin Baldwin, Manager, Editorial Services; Evan Butterfield, Director, Products & Services; Sandra Brown, Senior Business Development Manager; Marian Anderson, Senior Advertising Coordinator; [email protected]
Guest Editor's Introduction

Cloud Computing: Transforming Information Technology
Jon Rokne, University of Calgary

The migration of information and processes to the cloud is transforming not only where computing is done but, fundamentally, how it is done. Cloud computing solves many conventional computing problems, including handling peak loads, installing software updates, and utilizing excess computing cycles, but the new technology has also created new challenges in data security, data ownership, transborder data storage, and the training of highly skilled cloud computing professionals. As more in the corporate and academic worlds invest in this technology, IT professionals' working environments are also changing dramatically.

Taking Initiative

Recognizing that cloud computing is poised to be the dominant form of computing in the future, IEEE has funded a Cloud Computing Initiative (CCI) to coordinate its cloud-related activities. To that end, the IEEE CCI has established tracks for cloud computing standards, conferences, publications, and educational materials. The Cloud Computing Initiative portal site presents information on all these topics.

The CCI publications track is tasked with developing a slate of cloud computing-related periodicals. To date, it has provided seed funding for two publications: IEEE Transactions on Cloud Computing, launched in 2013, and IEEE Cloud Computing magazine, which will be available early next year. These publications aim to provide a focused home for cloud-related research and feature articles so that cloud researchers can publish their most important work, informing other professionals of new developments in the field.

In this Issue

To highlight the IEEE CCI's activities and serve as a preliminary announcement of the cloud publications that will

© 2013 IEEE. Published by the IEEE Computer Society. IEEE Cloud Computing 1
An Invitation

Consider this a personal invitation from me, Chris Miyachi, to join the Computer Society's Special Technical Community on Cloud Computing (CS STC CC), which I'm chairing. Special Technical Communities are put together by the Computer Society to establish nimble groups that address emerging interests. The CS STC CC is for members and run by members. Our charter is to provide members with accurate, vendor-neutral information that will demystify IT's top cloud-related concerns (such as ensuring adequate security, framing service-level agreements, impacts on staffing, and enabling rapid scaling up and down). We welcome people new to cloud computing, and we seek out experts in our community from both industry and academia.

What makes us different from other blogs and forums on cloud computing? Two things: our members and the power of the IEEE and the IEEE Computer Society. We work closely with the IEEE Cloud Computing Initiative, which produces content on cloud computing. We contribute to the CCI social networking sites (Facebook, LinkedIn, and Twitter), and we will provide content for the IEEE Cloud Computing web site (cloudcomputing.ieee.org).

But we need you. We need you not only to join the CS STC CC, which is free to all IEEE Computer Society members, but to volunteer for one of our open positions. We need you to join the conversation on our social networking sites. And we, as a community, need to continue to grow together to understand the rapidly changing world of cloud computing.

When I first started to work on projects related to cloud computing, I wondered why the concept was taking off now. After all, I can remember 25 years ago when thin clients were going to take over the world. And then they didn't. One of the reasons they didn't was that the price of hard drives dropped dramatically, changing the economics of putting data on your own computer.
But with the rise of the large server farms required to power Amazon and Google, the model of buying virtual servers and storage began to make more economic sense. And here we are today. Prices continue to decrease as service increases, and competition keeps everyone on their toes. Is cloud computing here to stay, or will we be back to personal systems in the future? Open source advocates (guardian.co.uk/technology/2008/sep/29/cloud.computing.richard.stallman) believe that cloud computing will force people to buy into proprietary systems. We as consumers of cloud systems will determine what the future of the cloud will look like. Do we want a common interface to the cloud that will allow us to move freely from one cloud provider to the next? If so, we will need to push for that with cloud providers. This kind of conversation is exactly the kind we will be having at the CS STC CC, so join the discussion today at www.computer.org/cc.

appear later this year and next, the IEEE Computer Society publications team has created this IEEE Cloud Computing supplement, reprinting cloud computing articles from other IEEE Computer Society magazines. The six articles in this supplement cover a wide range of cloud-related issues, focused particularly on security topics.

We open with IT Professional's "What's Special About Cloud Security?," in which author Peter Mell claims that developers can address cloud security issues by creatively applying techniques developed for other technologies. In "Toward Accountability in the Cloud," Siani Pearson explores cloud computing consumers' concerns about data and privacy protection, calling for taking context into account and avoiding one-size-fits-all approaches. "Public Sector Clouds Beginning to Blossom" is a news feature in which author Greg Goth explores cloud computing's attraction for the financially strapped public sector.
In "The Insecurity of Cloud Utility Models," Joseph Idziorek, Mark Tannian, and Doug Jacobson examine an issue that isn't immediately obvious: in the pay-as-you-go cloud billing process, fraudulent consumption (by a botnet, for example) can lead to significant financial harm for legitimate users. From there, we move on to Matthew Green's discussion of the security risks associated with running cryptographic services in cloud-based virtual machines in "The Threat in the Cloud." In our last article, "Implementing Effective Controls in a Mobile, Agile, Cloud-Enabled Enterprise," Dave Martin dissects the technical and cultural changes required of IT security teams as businesses increasingly rely on mobile and cloud-based activities.

This prototype illustrates the type of articles the IEEE Computer Society is already publishing on cloud computing, and it hints at the reliable, insightful content you can expect to find in IEEE Cloud Computing. For subscription information, be sure to visit www.computer.org/cloudcomputing.

The CCI publications track would like to have broad representation from IEEE societies with interests in cloud computing. If you wish to participate in the ongoing discussion of the publications initiatives, please contact me via email.

Jon Rokne is the IEEE CCI Publications track chair. He is a professor and former head of the computer science department at the University of Calgary and the past vice president of publications for IEEE. Contact him at [email protected].
Securing the Cloud

What's Special About Cloud Security?
Peter Mell, US National Institute of Standards and Technology

Although cloud security concerns have consistently ranked as one of the top challenges to cloud adoption,1 it's not clear what security issues are particular to cloud computing. To approach this question, I attempt to derive cloud security issues from various cloud definitions and a reference architecture.

Defining Cloud Computing

The European Network and Information Security Agency (ENISA) defines cloud computing as "an on-demand service model for IT provision, often based on virtualization and distributed computing technologies."2 It says that cloud computing architectures have highly abstracted resources, near-instant scalability and flexibility, nearly instantaneous provisioning, shared resources, service on demand, and programmatic management.

The US National Institute of Standards and Technology (NIST) has also published a cloud definition, which it has submitted as the US contribution for an international standard.3 According to NIST, "Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (for example, networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction."

The NIST definition lists five essential characteristics of cloud computing: on-demand self-service, broad network access, resource pooling, rapid elasticity or expansion, and measured service. It also lists three service models (software as a service [SaaS], platform as a service [PaaS], and infrastructure as a service [IaaS]) and four deployment models (private, community, public, and hybrid) that, together, categorize ways to deliver cloud services. NIST has also published a cloud computing reference architecture.
As Figure 1 shows, this architecture outlines the five major roles of cloud consumer, provider, broker, auditor, and carrier. These definitions and reference architecture provide a foundation from which we can begin to analyze cloud security issues.

Cloud Security Controls

Let's first look at cloud security controls documented within the Cloud Security Alliance (CSA) security control framework, which was informed by both the ENISA and NIST definitions. The CSA guidance, version 2.1, contains 98 different cloud security controls from 13 domains, which aim to help evaluate initial cloud risks and inform security decisions.5 This body of work would seem to indicate that, based on published cloud definitions, we can identify 98 cloud-specific security controls. However, all 98 controls have been mapped to existing implementation-independent security control frameworks,6 including NIST Special Publication and International Organization for Standardization control catalogs. Based on this evaluation, these security controls don't seem unique to cloud computing: US government and internationally standardized general-purpose security controls cover all known CSA cloud security controls.

The US government's Federal Risk and Authorization Management Program (FedRAMP) for cloud computing also uses the NIST cloud definition.7 Instead of creating new cloud security controls, FedRAMP published a selection of existing general-purpose controls from the NIST Special Publication security control catalog (staffoffices/fedramp_security_controls_Final.zip). Thus, the FedRAMP controls are also generically applicable.

This lack of novel security controls for the cloud might arise from the fact that cloud computing is the convergence of
many different technology areas, including broadband networks, virtualization, grid computing, service orientation, autonomic systems, and Web 2.0. Each of these underlying technology areas has been independently addressed by existing general-purpose security controls, so it seems logical to assume we can address the composition of these technology areas using these same general-purpose security controls. However, the cloud paradigm might still present security issues that require a novel application of the set of existing general-purpose security controls. Evidence for this argument lies in the fact that each CSA cloud security control was mapped to multiple controls from the general-purpose control frameworks.

[Figure 1. NIST cloud computing reference architecture. It outlines five major roles: cloud consumer, provider, broker, auditor, and carrier.]

Derivation of Cloud Security Issues

To show the existence of these security issues, I list a sampling derived from the initial cloud definitions and reference architecture. Many of the essential cloud characteristics, definitional models, and architectural components suggest cloud security issues.

Cloud Brokers
This reference architecture actor implies security composition challenges within composed clouds, such as a SaaS built on an IaaS.

On-Demand Delivery
This cloud characteristic suggests security challenges associated with the business user being able to easily and instantly obtain new computing resources that must be presecured on delivery.
Resource Pooling
This cloud characteristic guides customers toward a "put all your eggs in one basket" approach that might let users concentrate security resources on a single basket but that also heightens the need for backup and resiliency solutions. From a cloud customer perspective, this characteristic reveals the possibility that attacks against one customer could inadvertently affect another customer using the same shared resources.

Service Models
The cloud definition service models reveal challenges with multitenancy in a resource-pooled environment. All service models have data multitenancy, while PaaS and IaaS additionally have processing multitenancy, in which user processes might attack each other and the cloud itself.

Infrastructure as a Service
This service model reveals challenges with using virtualization as a frontline security defense perimeter to protect against malicious cloud users.

Broad Network Access
This cloud characteristic shifts the security
model to account for possibly untrustworthy client devices that are fully reliant on the network for service.

Measured Service
This cloud characteristic reveals the need to measure cloud usage to promote overall cloud availability.

The cloud computing paradigm appears to present special security issues that will require research and careful consideration. At this point, however, these issues don't appear to require completely new security controls but instead the creative application of existing security techniques.

Acknowledgments
Certain products or organizations are identified in this document, but such identification does not imply recommendation by the US National Institute of Standards and Technology (NIST) or other agencies of the US government, nor does it imply that the products or organizations identified are necessarily the best available for the purpose. This article reflects the author's personal opinions, not the opinions of the US Department of Commerce or NIST.

References
1. "IT Cloud Services User Survey, Part 2," IDC Enterprise Panel, Aug. 2008; www.clavister.com/documents/resources/white-papers/clavister-whp-security-in-the-cloud-gb.pdf.
2. "Cloud Computing: Benefits, Risks, and Recommendations for Information Security," European Network and Information Security Agency, Nov. 2009; europa.eu/act/rm/files/deliverables/cloud-computing-risk-assessment/at_download/fullreport.
3. "Final Version of NIST Cloud Computing Definition Published," NIST Tech Beat, 25 Oct. 2011.
4. F. Liu et al., "NIST Cloud Computing Reference Architecture," NIST recommendation, Sept. 2011; gov/twiki-cloud-computing/pub/cloudComputing/ReferenceArchitectureTaxonomy.
5. "Security Guidance for Critical Areas of Focus in Cloud Computing V2.1," Cloud Security Alliance, Dec. 2009; cloudsecurityalliance.org/wp-content/uploads/2011/07/csaguide.v2.1.pdf.
6. "Cloud Controls Matrix, Version 1.2," Cloud Security Alliance, Aug. 2011; research/initiatives/ccm.
7. S. VanRoekel, "Memorandum for Chief Information Officers," Executive Office of the President, 8 Dec. 2011, footnotes 5 and 6.

Peter Mell is a computer scientist at the US National Institute of Standards and Technology. His research interests include big data technology, cloud computing, vulnerability databases, and intrusion detection. Contact him at [email protected].

This article originally appeared in IT Professional, July/August 2012.

Get Involved with the IEEE Cloud Computing Initiative
Follow us: IEEECloudComputing

Cloud computing has widespread impact across how we access today's applications, resources, and data. The IEEE Cloud Computing Initiative (CCI) intends to lead the way by collaborating across the interested IEEE societies and groups for a well-coordinated and cohesive plan in the areas of big data, conferences, education, publications, standards, testbed, and a dedicated web portal.

Get involved: The CCI offers many opportunities to participate, influence, and contribute to this technology. Contact us at [email protected].

Current opportunities: Submit a paper or help organize at one of our conferences. Contribute an article to our new Transactions on Cloud Computing publication. Be a part of the P2302 standards working group for intercloud interoperability and federation.

Save the date: Cloud Computing for Emerging Markets (CCEM), October 2013, Bangalore, India (cloudcomputing.ieee.org/ccem).

Check out the cloud web portal for the latest information on the CCI's activities: cloudcomputing.ieee.org
Toward Accountability in the Cloud
Siani Pearson, HP Labs

Accountability is likely to become a core concept in both the cloud and in new mechanisms that help increase trust in cloud computing. These mechanisms must be applied in an intelligent way, taking context into account and avoiding a one-size-fits-all approach.

The US National Institute of Standards and Technology defines cloud computing as "a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (for example, networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." In short, the cloud offers huge potential both for efficiency and for new business opportunities (especially in service composition), and it is almost certain to deeply transform our IT. Not only will cost savings occur due to economies of scale on the service provider side and pay-as-you-go models, but business risk also decreases because organizations have less need to borrow money for upfront investment in infrastructure. However, to help realize these benefits, we must address two primary barriers: lack of consumer trust and the complexity of compliance. Here, I argue that the concept of accountability is key to addressing these issues.

Barriers to Cloud Adoption

Lack of consumer trust is commonly recognized as a key inhibitor to moving to software-as-a-service (SaaS) cloud models. People have increasing expectations that companies with which they share their data will protect it and handle it responsibly. Furthermore, compared to traditional server architectures, cloud consumers are more concerned about their data's integrity, security, and privacy as focus shifts from server health to data protection. However, current terms of service push risk back on consumers and offer little remediation or assurance.
Potential cloud customers perceive a lack of transparency and relatively less control than with traditional models, which is of particular concern in the context of sensitive information. Some cases have arisen in which cloud service providers (CSPs) have been forced by subpoena to hand over data stored in the cloud, and a fear persists that governments might get access to information stored in servers within their countries. Moreover, it isn't clear what would happen if things went wrong. Would providers notify users if a privacy breach occurred? Who would be at fault in such cases? Working out how victims could obtain redress is complex and hard to ascertain. It's also difficult to determine whether data has been properly destroyed (as it should be, for example, in the case of a CSP's bankruptcy or if a customer wishes to switch to a different CSP). So, people are concerned about weak trust relationships along the chain of service provision, especially as regards on-demand models in which users might have to find CSPs quickly; in such cases, trust won't necessarily be transitive along the chain.

A second barrier to cloud migration is the difficulty CSPs can have with compliance across geographic boundaries. Dataflows tend to be global and dynamic. Location matters from a legal viewpoint, leading to regulatory complexity. Complying with legislation can be difficult with regard to transborder dataflow requirements and determining which laws apply and which courts should preside. Issues such as unauthorized secondary data usage and inappropriate data retention are also difficult to address.

These two issues, trust and the complexity of compliance, are closely linked. CSPs have both legal and ethical obligations to ensure privacy and protect data, and thereby demonstrate their services' trustworthy nature. This higher risk to privacy and security in cloud computing is a magnification of issues faced in subcontracting and offshoring.
Consumers aren't the only ones worried about privacy and security concerns in the cloud.1 The European Network and Information Security Agency (ENISA)'s cloud computing risk assessment report states loss of governance as a top risk of cloud computing, especially for infrastructure as a service (IaaS).2 Data loss or leakage is also one
of the top seven threats the Cloud Security Alliance (CSA) lists in its Top Threats to Cloud Computing report.3 The cloud's autonomic and virtualized aspects can bring new threats, such as cross-VM (virtual machine) side-channel attacks, or vulnerabilities due to data proliferation, dynamic provisioning, the difficulty in identifying physical servers' location, or a lack of standardization. Although service composition is easier in cloud computing, some services might have a malicious source. All these privacy and security risks might actually decrease, however, if users move from a traditional IT model to a cloud model with CSPs who have expertise in privacy and security.

Accountability can help us tackle these challenges in trust and complexity. It's especially helpful for protecting sensitive or confidential information, enhancing consumer trust, clarifying the legal situation in cloud computing, and facilitating cross-border data transfers. My focus here is on data-protection issues in the cloud. The term data protection has more of a privacy focus in Europe but a broader data security context in the US. I focus primarily on privacy, but some of these issues transcend personal data handling and generalize to other types of data, beyond privacy concerns.

What Is Accountability?

For several years, computer science has used the term accountability to refer to a narrow and imprecise requirement that's met by reporting and auditing mechanisms. Here, however, I use the term in the context of corporate data governance. Accountability (for complying with measures that give effect to practices articulated in given guidelines) has been present in many core frameworks for privacy protection, most notably the Organization for Economic Cooperation and Development (OECD)'s privacy guidelines (1980),4 Canada's Personal Information Protection and Electronic Documents Act (2000),5 and the Asia Pacific Economic Cooperation (APEC) Privacy Framework (2005).
More recently, regional-bloc governance models are evolving to incorporate accountability and responsible information use, and regulators are increasingly requiring that companies prove they're accountable. In particular, legislative authorities are developing frameworks such as the EU's Binding Corporate Rules (BCRs) and APEC's Cross Border Privacy Rules to provide a cohesive and more practical approach to data protection across disparate regulatory systems. For example, BCRs require that organizations demonstrate that they are, and will be, compliant with requirements that EU Data Protection Authorities (DPAs) have defined for transferring data outside the EU.

More recently, several groups have highlighted accountability's significance and utility in introducing innovations to the current legal framework in response to globalization and new technologies (see The Future of Privacy from the Article 29 Working Party,7 its opinion of July 2010,8 and the Madrid Resolution's global data protection standards, which the International Conference of Data Protection and Privacy Commissioners adopted in October 2009).

The Galway project, started by privacy regulators and privacy professionals, defines accountability in the context of these latest regulations: "Accountability is the obligation to act as a responsible steward of the personal information of others, to take responsibility for the protection and appropriate use of that information beyond mere legal requirements, and to be accountable for any misuse of that information."9 Central components of this notion are transparency, responsibility, assurance, and remediation. With regard to responsibility, organizations must demonstrate that they've acknowledged and assumed responsibility, in terms of both having appropriate policies and procedures in place and promoting good practices that include correction and remediation for failure and misconduct.
Such organizations must employ responsible decision making and, in particular, report, explain, and be answerable for the consequences of decisions they've made with regard to data protection.

Retrospective vs. Prospective Accountability

Some have argued that to provide accountability, we must shift from hiding information to ensuring that only appropriate uses occur.10 Information usage should be transparent so that we can determine whether a use is appropriate under a given set of rules. CSPs can maintain a history of data manipulation and inferences (providing transparency) that can then be checked against the policies that govern them. This provides retrospective accountability; that is, if actor A performs action B, then we can review B against a predetermined policy to decide whether A has done something wrong and so hold A accountable.

We must extend this approach to include prospective effects because the environment might change: for instance, new risks might arise for data subjects because the service provisioning chain alters, the location of the physical servers storing or processing data changes, a CSP comes under new ownership, or a new type of attack occurs. Reducing the risk of disproportionate harm to data subjects thereby reduces negative consequences for data controllers. To do this, we must build in processes and reinforce good practices such that liability doesn't arise in the first place.11 This is a reflexive privacy process that isn't static, and in which the data controller must conduct an ongoing assessment of harm and a privacy review process throughout the contractual or service provision chain.
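The retrospective check described here (review a logged action against a predetermined policy, then hold the actor accountable) can be sketched in a few lines of Python. This is purely an illustrative sketch: the Action fields, the policy rules, and the audit function are invented for the example and are not drawn from any real CSP mechanism.

```python
from dataclasses import dataclass

@dataclass
class Action:
    actor: str       # who performed the action (e.g., a CSP in the provision chain)
    operation: str   # what was done, e.g. "read", "store", "transfer"
    data_class: str  # e.g. "personal" or "anonymized"
    location: str    # where the operation ran

# Hypothetical predetermined policy: permitted operations and locations per data class.
POLICY = {
    "personal":   {"operations": {"read", "store"}, "locations": {"EU"}},
    "anonymized": {"operations": {"read", "store", "transfer"}, "locations": {"EU", "US"}},
}

def audit(history):
    """Retrospective check: flag every logged action that violates the policy."""
    violations = []
    for a in history:
        rule = POLICY.get(a.data_class)
        if (rule is None
                or a.operation not in rule["operations"]
                or a.location not in rule["locations"]):
            violations.append(a)
    return violations

log = [
    Action("csp-1", "read", "personal", "EU"),      # compliant
    Action("csp-2", "transfer", "personal", "US"),  # violates operation and location rules
]
print([f"{v.actor}:{v.operation}" for v in audit(log)])  # → ['csp-2:transfer']
```

The point of the sketch is the shape of the mechanism, not the rules themselves: transparency supplies the history, and accountability is the ability to run such a review after the fact and attribute each violation to an actor.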
Broadly speaking, an accountability approach in accordance with current regulatory thinking requires organizations to

- commit to accountability and establish policies consistent with recognized external criteria;
- provide transparency and mechanisms for individual participation, including sharing these policies with stakeholders and soliciting feedback;
- use mechanisms to implement these policies, including clear documentation and communication (encompassing an organization's ethical code), support from all levels within the organizational structure, tools, training, education, ongoing analysis, and updating;
- allow validation, that is, provide means for external enforcement, monitoring, and auditing; and
- provide mechanisms for remediation,
which should include event management (such as dealing with data breaches) and complaint handling.

I argue that we can extend the third item in this list to encompass both preemptive approaches (to assess risk and avoid privacy harm) and reactive approaches that provide transparency and auditing. These privacy policies and mechanisms must take into account the entire life cycle of personal data usage, including deletion. Companies must think about not only what data they'll collect and how they plan to use it but also what potential harm the proposed use of that data could cause to individuals. Without going into the intricacies of legal ownership, the data subject is normally, in a fundamental sense, the real owner of his or her data and is ultimately the person harmed in the event of a privacy breach; this person should be empowered and supported. For example, if you're tracking someone's behavior online, under an accountability approach you might provide clear notice that tracking is happening, an explanation of how you plan to use the data, and a mechanism for individuals to opt out of tracking and request that you delete previous tracking data about them.

Data Stewardship

A closely related notion to accountability is data stewardship.12 In a cloud model, many different cloud providers in an ecosystem consume IT. Understanding such ecosystems can be challenging, and we must make a paradigm shift in our thinking. Security and privacy management evolves into an information stewardship problem, that is, how organizations can properly look after and protect information (in a broader sense than just personal data) on behalf of data owners, subjects, and third parties. In the cloud, establishing risks and obligations, implementing appropriate operational responses, and dealing with regulatory requirements will be more difficult than with traditional server architectures.
The notions of transparency and assurance are more relevant, and data controllers and CSPs must ensure chains of accountability. Accountability places a legal responsibility on an organization that uses personal information to ensure that the contracted partners to whom it supplies this data are compliant, wherever they might reside worldwide. So, the communities responsible for data stewardship (typically organizational IT security, legal, operations, and compliance staff) place responsibilities and constraints on other individuals or on how systems operate, and these constraints are met along the chain of provision.

Intelligent Accountability

Baroness O'Neill first proposed the idea of intelligent accountability, as a means to provide greater accountability without damaging professional performance, in her 2002 Reith Lectures, "A Question of Trust." She argued that much of what individuals and organizations must account for isn't easily measured and can't be reduced to a set of stock performance indicators. O'Neill said that intelligent accountability requires more attention to good governance and fewer fantasies about total control, and that good governance is possible only if institutions are allowed some margin for self-governance of a form appropriate to their particular tasks. We must introduce accountability in an intelligent way, or trust won't increase, and the overall effect could be quite negative with regard to the increased administrative burden.
As relates to the cloud, intelligent accountability could involve:

- moving away from box checking and static privacy mechanisms;
- assessing potential harms to data subjects before exposing data to risks, as part of ongoing risk assessment and mitigation, for which privacy impact assessments (PIAs) are one important tool;
- allowing organizations more flexibility in how they provide data protection, so that they can use internal mechanisms and controls that make the most sense for their business situation rather than a one-size-fits-all prescriptive set of rules;
- employing various degrees of accountability (more stringent standards and tests for accountability could facilitate proof of CSPs' readiness to engage in certain activities, such as those that involve processing highly sensitive data, or even relieve them of certain administrative burdens, such as renotification of minor changes in processing); and
- developing clever automated analysis, automated internal policy enforcement, and other technologies to enhance enforcement and avoid increasing the human burden.

As an integral part of an intelligent accountability approach, organizations will need to spend time and resources analyzing what it means to them and gaining management support for implementing necessary changes.

How to Provide Accountability in the Cloud

Accountability promotes the implementation of practical mechanisms whereby legal requirements and guidance are translated into effective data protection. Legislation and policies tend to apply at the data level, but mechanisms for accountability can exist at various levels, including system and data levels. Solution builders could provide data controllers with a toolbox of measures to enable the construction of custom-built solutions whereby controllers could tailor measures to their context (taking into account the systems involved, the type of data, dataflows, and so on).
We can codesign legal mechanisms, procedures, and technical measures to support this approach. We might integrate design elements to support prospective (and proactive) accountability using preventive controls, and retrospective (and reactive) accountability using detective controls. Preventive controls can help determine whether an action continues or takes place at all (for example, an access list that governs
who can read or modify a file or database, or network and host firewalls that block all but allowable activity). The cloud is a special example of how businesses must assess and manage risk better.13 Preventive controls for the cloud include risk analysis and decision support tools, policy enforcement (for example, machine-readable policies, privacy-enhanced access control, and obligations), trust assessment, obfuscation techniques, and identity management. Organizations can use detective controls to identify privacy or security risks that go against policies and procedures (for example, intrusion-detection systems, policy-aware transaction logs, language frameworks, and reasoning tools). Detective controls for the cloud include auditing, tracking, reporting, and monitoring. In addition, corrective controls (such as an incident management plan or dispute resolution) are necessary to help fix an undesired outcome that's already occurred. These controls complement each other: ideally, a combination would be required for accountability.

Provision of accountability wouldn't occur only via procedural means, especially for the cloud, which is an automated and dynamic environment: technology can play an important role in enhancing solutions by enforcing policies and providing decision support, assurance, security, and so on. Procedural measures for accountability include determining CSPs' capabilities before selecting one, negotiating contracts and service-level agreements (SLAs), restricting the transfer of confidential data to CSPs, and buying insurance. Organizations should also appoint a data-protection officer, regularly perform privacy impact assessments on new products and services, and put mechanisms in place to allow quick response to data subject access and deletion requests. Technical measures for accountability can include encryption for data security, privacy infomediaries, and agents to help increase trust.
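To make the preventive/detective distinction concrete, here is a minimal sketch of a preventive control: an access list consulted before an action is allowed to reach the resource at all. The user and resource names are purely illustrative, not drawn from any product discussed here.

```python
# Sketch of a preventive control: an access list checked before any read
# or modify operation proceeds. Names here are illustrative only.

ACCESS_LIST = {
    "patient_records.db": {"alice": {"read", "modify"}, "bob": {"read"}},
}

def is_permitted(user: str, resource: str, action: str) -> bool:
    """Return True only if the access list explicitly grants the action."""
    grants = ACCESS_LIST.get(resource, {})
    return action in grants.get(user, set())

# The control is preventive: a denied request never reaches the resource.
assert is_permitted("alice", "patient_records.db", "modify")
assert not is_permitted("bob", "patient_records.db", "modify")
assert not is_permitted("carol", "patient_records.db", "read")
```

A detective control, by contrast, would let the action proceed and instead record it in a policy-aware transaction log for later auditing.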
We must also be able to rely on infrastructure to maintain appropriate separations, enforce policies, and report information accurately. At HP Labs, we're investigating how to build and exploit trusted virtualized platforms with precisely these properties. Another mechanism we're researching is the use of sticky policies, in which machine-readable policies (defining allowed usage and associated obligations) are attached to data within the cloud and travel with it. Other mechanisms include risk assessment, decision support, obfuscation in the cloud, and policy translation from higher-level policies to machine-readable ones that are enforced and audited. We don't have the space here to describe all this work, so I'll just briefly outline three examples of our research.

First, we've worked with the HP Privacy Office to develop and deploy a tool called the HP Privacy Advisor, which takes employees through a series of dynamically generated contextual questions and outputs the risk to privacy compliance in any new product, service, or program. It encodes HP's privacy rulebook and other sources and provides privacy-by-design guidance. An associated workflow with privacy managers ensures that employees address the suggested actions mitigating these risks.

Second, the Cloud Stewardship Economics project is defining mathematical and economic models of the cloud ecosystem and the different choices cloud stakeholders face. The goal is to help cloud consumers, providers, regulators, and other stakeholders explore and predict the consequences of different policies, assurance mechanisms, or even ways of regulating accountability. This can facilitate consumer choice; as chains of providers become more complex, the models can highlight how and why evidence sharing is likely to provide necessary assurance.
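The sticky-policy idea can be illustrated with a toy model (a sketch only, not HP's actual mechanism): a machine-readable policy is bound to the data object as metadata, and every access is checked against it and logged.

```python
# Toy model of a sticky policy: allowed uses and obligations travel with
# the data, and any use must satisfy the policy first. Field names and
# the example obligation strings are hypothetical illustrations.

from dataclasses import dataclass, field

@dataclass
class StickyPolicy:
    allowed_uses: set
    obligations: list  # e.g., "notify_subject_on_access", "delete_after_90d"

@dataclass
class PolicyBoundData:
    payload: bytes
    policy: StickyPolicy
    audit_log: list = field(default_factory=list)

    def access(self, requester: str, purpose: str) -> bytes:
        """Release the payload only for a purpose the sticky policy allows."""
        if purpose not in self.policy.allowed_uses:
            self.audit_log.append((requester, purpose, "DENIED"))
            raise PermissionError(f"{purpose!r} not permitted by sticky policy")
        self.audit_log.append((requester, purpose, "GRANTED"))
        return self.payload

record = PolicyBoundData(
    payload=b"subject-data",
    policy=StickyPolicy(allowed_uses={"billing"},
                        obligations=["delete_after_90d"]),
)
assert record.access("invoicing-service", "billing") == b"subject-data"
```

The essential property is that the policy and its audit trail travel with the data, so obligations remain checkable wherever in the cloud the data ends up.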
Finally, we're working to achieve accountability using contractual assurances along the service provision chain from CSPs to accountable organizations, enhanced on the technical side by enforcement of corresponding machine-readable policies propagated with (references to) data through the cloud, integrated risk assessment, assurance, and auditing. By these means, the accountable organizations can ensure that all who process data observe their obligations to protect it, irrespective of where that processing occurs.

Moving Forward

The current regulatory structure places too much emphasis on recovering from harm and not enough on getting organizations to proactively reduce privacy and security risks. New data governance models for accountability can provide a basis for data protection when people use cloud computing. Accountability is becoming more integrated into self-regulatory programs as well as future privacy and data protection frameworks globally. If CSPs don't think beyond mere compliance and demonstrate a capacity for accountability, regulations will likely develop that could be difficult to follow and might stifle innovation; a backlash might also arise from data subjects.

Strengthening an accountability approach and making it more workable by developing intelligent ways to apply accountability and information stewardship is a growing challenge. It goes beyond traditional approaches to protecting data (such as security and the avoidance of liability) in that it includes complying with and upholding values and obligations, and enhancing trust. Hewlett-Packard is actively working in this area to produce practical solutions, on both the policy (HP Privacy Office) and technical (HP Labs) fronts. At present, we're just starting to see some technical work emerging from other parties in this area.
The Cloud Security Alliance (CSA), a nonprofit organization formed to promote the use of best practices for providing security assurance within cloud computing, has a Governance, Risk Management, and Compliance (GRC) stack that includes two very relevant activities: CloudAudit, which aims to provide a technical foundation to enable transparency and trust in private and public cloud systems, and the Trusted Cloud Initiative, which is working toward certifying trusted clouds. HyTrust Appliance is a hypervisor consolidated-log-report and policy-enforcement tool that logs from a system perspective. The Commonwealth Scientific
and Industrial Research Organization (CSIRO) has produced a prototype in which CSPs are accountable for faulty services. The Computer Sciences Corporation (CSC) is developing a CloudTrust protocol that will promote CSP transparency.

At HP Labs, our broader vision is to deliver seamless, secure, context-aware experiences for a connected world. The richness, choice, and convenience of how we interact with our devices and a pervasive computing environment will be enhanced. At the same time, we want this to be safe and ultimately controlled by end users. We've been introducing, and will continue to research, innovative techniques to uphold HP's ethics and values internally and demonstrate this to our stakeholders and customers.

References
1. R. Gellman, "Privacy in the Clouds: Risks to Privacy and Confidentiality from Cloud Computing," World Privacy Forum, 2009; _Cloud_Privacy_Report.pdf.
2. Cloud Computing: Benefits, Risks and Recommendations for Information Security, D. Catteddu and G. Hogben, eds., ENISA, Nov. 2009; rm/files/deliverables/cloud-computing-risk-assessment/at_download/fullReport.
3. Top Threats to Cloud Computing, version 1.0, tech. report, Cloud Security Alliance, Mar. 2010; org/topthreats/csathreats.v1.0.pdf.
4. Guidelines Governing the Protection of Privacy and Transborder Flow of Personal Data, Organization for Economic Cooperation and Development (OECD).
5. Personal Information Protection and Electronic Documents Act (PIPEDA), Canada, schedule 1, principle 1.
6. APEC Privacy Framework, Asia-Pacific Economic Cooperation, 2005; org/groups/committee-on-trade-and-Investment/~/media/Files/Groups/ECSG/05_ecsg_privacyframewk.ashx.
7. The Future of Privacy: Joint Contribution to the Consultation of the European Commission on the Legal Framework for the Fundamental Right to Protection of Personal Data, EU Article 29 Working Party, WP168, Dec. 2009; ec.europa.eu/justice/policies/privacy/docs/wpdocs/2009/wp168_en.pdf.
8. Opinion 3/2010 on the Principle of Accountability, EU Article 29 Working Party, WP173, July 2010; ec.europa.eu/justice/policies/privacy/docs/wpdocs/2010/wp173_en.pdf.
9. "Galway Project Plenary Session Introduction," Galway Project, 28 Apr. 2009.
10. D. Weitzner et al., "Information Accountability," Comm. ACM, vol. 51, no. 6, 2008.
11. S. Pearson and A. Charlesworth, "Accountability as a Way Forward for Privacy Protection in the Cloud," Proc. 1st Int'l Conf. Cloud Computing, LNCS 5931, M.G. Jaatun, G. Zhao, and C. Rong, eds., 2009.
12. D. Pym and M. Sadler, "Information Stewardship in Cloud Computing," Int'l J. Service Science, Management, Engineering and Technology, vol. 1, no. 1, 2010.
13. A. Baldwin and S. Shiu, Managing Digital Risk: Trends, Issues, and Implications for Business, tech. report, Lloyd's 360 Risk Insight.

Siani Pearson is a senior researcher in the Cloud and Security Research Lab at HP Labs Bristol. Her current research focus is on privacy-enhancing technologies, accountability, and the cloud. Pearson has a PhD in artificial intelligence from the University of Edinburgh. She's a technical lead on regulatory compliance projects with the HP Privacy Office and HP Enterprise Services and on the collaborative TSB-funded Ensuring Consent and Revocation project. Contact her at [email protected].

This article originally appeared in IEEE Internet Computing, July/August 2011.

IEEE Computer Society Offers Cloud Computing Course Series

As part of its mission to support the needs of those in the computing industry, the IEEE Computer Society has developed a series of professional development courses on cloud computing. These products have been developed by IEEE-CS staff as well as a large number of subject matter experts chosen from Society membership and other authoritative sources. These courses are part of the Computer Society's Specialty Course Series and will include an overview and concepts course, in-depth courses, and various other products addressing the essential concepts and elements of the cloud from the perspective of the business and IT decision-maker.

Managers are often faced with deciding if, and how, to upgrade their IT infrastructure, and how to pay for it. In an environment of tight budgets and soaring hardware and software costs, they are also looking for alternatives to making huge investments that will have to be upgraded again and again. The cloud can be that solution. Managers need information to make intelligent decisions, however. Questions pertaining to cloud economics, security, regulation and governance, metrics, and migration are introduced and discussed in the Cloud Computing course series. In the final analysis, managers must be able to answer key questions: Is the cloud the right place for my IT infrastructure and data? Is it a good business decision? How do I migrate to the cloud? This course series examines these and other key concepts. To learn more about this cloud course series, contact Dorian McClenahan at the Certification and Professional Education Group at [email protected].
SUBMIT NOW

IEEE Transactions on Cloud Computing

The IEEE Transactions on Cloud Computing will publish peer-reviewed articles that provide innovative research ideas and application results in all areas relating to cloud computing. Topics relating to novel theory, algorithms, performance analyses, and applications of techniques relating to all areas of cloud computing will be considered for the transactions. The transactions will consider submissions specifically in the areas of cloud security; tradeoffs between privacy and utility of cloud; cloud standards; the architecture of cloud computing; cloud development tools; cloud software; cloud backup and recovery; cloud interoperability; cloud applications management; cloud data analytics; cloud communications protocols; mobile cloud; liability issues for data loss on clouds; data integration on clouds; big data on clouds; cloud education; cloud skill sets; cloud energy consumption; and cloud applications in commerce, education, and industry. This title will also consider submissions on Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Software as a Service (SaaS), and Business Process as a Service (BPaaS).

TCC Editor-in-Chief
Rajkumar Buyya, Director, Cloud Computing and Distributed Systems (CLOUDS) Lab, The University of Melbourne

TCC Steering Committee Members
IEEE Computer Society: Jon Rokne (SC Chair), Tom Conte, Irena Bojanova, Dejan Milojicic
IEEE Communications Society: Vijay Bhargava, Vincent Chan
IEEE Systems Council: Paolo Carbone
IEEE Power & Energy Society: Jie Li, Badrul Chowdhury
IEEE Consumer Electronics Society: Stu Lipoff

For more information please visit:
Public Sector Clouds Beginning to Blossom: Efficiency, New Culture Trumping Security Fears

Greg Goth

As governments around the world continue to grapple with sluggish economies, cloud computing is emerging as a possible answer to demands for reducing public sector spending. Cloud computing "is starting to emerge as a technology that's proven," says Neil McEvoy, president of the Toronto-based Cloud Best Practices Network. Government, McEvoy says, is an ideal context for this new technology and the value it can bring. "It's designed to allow multiple organizations to consolidate different levels of the technology stack in a manner that helps them drive much more efficient use of infrastructure. If you look at the utilization of IT traditionally, huge amounts of servers are racked up for peak usage. For all other times, they are horribly underused. That's simply not an effective use of taxpayers' money in a time of economic crunch."

A bird's-eye view of public sector cloud computing might actually lead to an incomplete conclusion: taken as government-wide initiatives, many cloud strategies seem to be stalled in political maneuvering or in concerns about intruders (whether they're agents of other governments or independent hackers) gaining access to sensitive areas of government networks. However, numerous agency-by-agency cloud solutions have either already been implemented or are about to launch. And a wide array of supporting organizations, including governmental agencies such as the US National Institute of Standards and Technology (NIST) and private sector organizations such as the TechAmerica Foundation, are creating an ecosystem of public sector cloud architectural requirements and best practices that correlate with new cloud grid installations.
Setting the Standards

NIST issued two documents in February 2011 that are widely considered the keystone documents for cloud architecture definitions and cloud security. These documents, McEvoy says, are becoming the de facto standards for governments worldwide. "NIST has grown to become the root authority for the cloud computing industry globally," he says. "The NIST document defines cloud at a high level, and then more specifically, the detailed recommendations in areas like information security are very actively followed. In fact, the core cloud computing initiative for the government of Canada is what they call the Government Community Cloud, and that's based on the NIST model of the same name. Then, on top of the NIST-compatible architecture, they are layering their expertise on more Canadian-specific requirements."

Jennifer Kerber, vice president for homeland security and federal civilian policy at the TechAmerica Foundation, says network architects and administrators in government agencies can see incentives on two fronts encouraging more cloud adoption. In the initial big push for cloud computing within the federal government, Kerber says, many network administrators and managers were unsure about the cloud's benefits. But when you look at the US government and private sector markets and see the difference in efficiency gains in the private sector over government in a 10-, 20-, and 30-year period, and realize a lot of that is through embracing technological innovation, "in the current fiscal environment it's a natural [direction] for the federal government to look."

Former federal CIO Vivek Kundra, who spearheaded the Obama administration's "cloud first" policy, recently answered a New York Times news story that quoted officials in the Defense and State departments who were wary of security issues in the cloud with an opinion piece in the publication at the end of August.
"Some agencies, like the General Services Administration, have embraced cloud computing; the agency has cut the IT costs on things as simple as its system by over 50 percent," Kundra wrote. But other agencies have balked. The State Department, for instance, has raised concerns about whether the cloud approach introduces security risks, since data is stored off site by private contractors.
But cloud computing is often far more secure than traditional computing, because companies like Google and Amazon can attract and retain cybersecurity personnel of a higher quality than many governmental agencies can. Government employees are so accustomed to using cloud services like Dropbox and Gmail in their personal lives that, even if their agencies don't formally permit cloud computing, they use it for work purposes anyway, creating a "shadow IT" that leads to a more vulnerable organization than a properly overseen cloud computing system would.

Federal research agencies are initiating cloud security programs, such as DARPA's CRASH (Clean-slate Redesign of Resilient, Adaptive, Secure Hosts) and Mission-Oriented Resilient Clouds, but one cloud computing executive thinks these efforts are a long way from bearing fruit on a wide basis. "I see that as a pretty long-term theoretical thing, several years and several millions of dollars away," says Michael Sutton, vice president of security for cloud security vendor Zscaler. "I don't think agencies should be waiting for any silver bullet that will let them know, OK, the cloud's now secure enough for me. The approach should be no different than it always has been. You'll always have data of varying levels of classification and risk, and you have to look at those, decide what is appropriate for the cloud today and what is not, and move in an appropriate fashion."

A Common Path

The latest trends in government clouds seem to be following the agency-by-agency scenario: while the British government's top-level G-Cloud initiative seems to have stalled in a change of government, the UK's National Health Service (NHS) quietly signed an agreement with Zscaler to provide the NHS with its product. "The thing that really helped us was, it was just a massive environment, and very disparate," Sutton says. "Different hospitals would have their own IT departments, and it was spread like that through the entire country."
"An offering like Zscaler was very desirable to them because it didn't require deploying new hardware, and they didn't have to deal with certain pieces of hardware and software not working with everything. You could just float everything through the cloud and it would work in all these environments."

Sutton says a key factor in governmental cloud adoption will be the method by which private sector cloud vendors and their public-agency counterparts demarcate their respective responsibilities. "There are certainly plenty of existing guidelines as to how things have to operate (they have to comply with FISMA [the Federal Information Security Management Act], for instance) and none of that is going to go away, but now it's harder to define boundaries," he says. "Certainly, vendors and the private sector will have to help with that."

For example, Sutton says Amazon's discrete federal cloud approach (which gained FISMA approval in September) illustrates the private sector's recognition that government entities could require separate platforms. "We have a public cloud, and we recognize that not all agencies are going to be able to adopt that, either because they have unique requirements or are more conservative," he says. "So we realize we're going to have to build some private clouds for government, or we're going to have to have hybrid clouds with certain components under their control."

In nations with underdeveloped cloud resources, such as Canada, McEvoy believes moving to the cloud can create a fantastic opportunity for the government to enjoy a double whammy of a benefit. If the ministries responsible for economic development and IT could more closely coordinate their cloud strategies, McEvoy says, they could outsource the public sector infrastructure they need while simultaneously bootstrapping several Canadian companies that could go on to expand internationally.
Creating New Culture

Ideally, governments at national, regional, and local levels should be able to share cloud computing resources, whether they're infrastructure and platforms or, as McEvoy believes, applications developed in one jurisdiction that can then be ported to others with similar tasks. However, a pioneering cloud approach coordinated by the US Centers for Disease Control and Prevention (CDC) is demonstrating that cloud computing can also lead to cultural changes in how public agencies interact with vital public data. The CDC's BioSense program, launched in 2003 as a federal government-housed and -controlled platform meant to address bioterrorism concerns, is about to launch a comprehensive redesign: a cloud-based data reporting and analysis platform in which multiple public health agencies will share both data and governance. The new BioSense platform, hosted by Amazon Web Services, will be governed by the Association of State and Territorial Health Officials (ASTHO), in coordination with the Council of State and Territorial Epidemiologists (CSTE), the US National Association of County and City Health Officials (NACCHO), and the International Society for Disease Surveillance (ISDS).

Taha Kass-Hout, the CDC's program manager for the BioSense program, says the new collaborative approach will better mirror how local and regional public health agencies deal with possible outbreaks of disease or attack. "Biosurveillance is really about the local context anyway," Kass-Hout says, "and in redesigning BioSense, we had to be cognizant not just of legal issues such as data use agreements but also of respecting business logic at the various levels and the best practice procedures they've instituted. Data should flow from providers to local departments and upward, but it should also flow horizontally."
"Local and state health departments have the best relationship with providers, they understand the context in which an event has happened, and they understand their population more than anybody else. If we can make sure they have ownership of that data and the initial vetting of it is there, that would be the basis to truly start stitching a regional and national picture."

Greg Goth is a freelance technology writer based in Connecticut.

This article originally appeared in IEEE Internet Computing, November/December 2011.
The Insecurity of Cloud Utility Models

Joseph Idziorek, Mark F. Tannian, and Doug Jacobson, Iowa State University

Cloud-based services are vulnerable to attacks that seek to exploit the pay-as-you-go pricing model. A botnet could perform fraudulent resource consumption (FRC) by consuming the bandwidth of Web-based services, thereby increasing the cloud consumer's financial burden.

A key feature that has led to the early adoption of public cloud computing is the utility pricing model, which governs the cost of computing resources consumed. Similar to public utilities, such as gas and electricity, cloud consumers pay only for the resources (storage, bandwidth, and computer hours) they consume and for the time they use such resources. In accordance with the terms of agreement of the cloud service provider (CSP), cloud consumers are responsible for all computational costs incurred in their leased compute environments, regardless of whether the resources were consumed in good faith.

Common use cases for corporations that have adopted public cloud computing include website and Web application hosting and e-commerce. Like any Internet-facing presence, these cloud-based services are vulnerable to distributed denial-of-service (DDoS) attacks. Such attacks are well known, and the associated risks have been well researched. Here, we explore a more subtle attack on Web-based services hosted in the cloud. Given the pay-as-you-go pricing, cloud-hosted Web services are vulnerable to attacks that seek to exploit this model. An attacker (for example, a botnet) can perform a fraudulent resource consumption (FRC) attack by consuming the metered bandwidth of Web-based services, increasing the cloud consumer's financial burden.1,2 In the scenario in Figure 1, a botnet comprising potentially thousands of bot clients is consuming Web resources hosted in the cloud by mimicking legitimate client behavior.
To the cloud-based Web application, the intention of incoming requests is either unknown or not considered, so each request is serviced with a reply, resulting in a fractional cost for the cloud consumer. Because this vulnerability hasn't been widely discussed until now, determining this threat's overall effect on the cloud community is difficult. Rather, we focus here on describing the vulnerability to increase awareness, analyzing the risk for an individual cloud consumer, and discussing methods for FRC prevention, detection, attribution, and mitigation.

The Utility Model

The utility model is attractive to a cloud consumer because the low entry cost removes the burden of major capital expenses. However, although convenient, the utility model isn't without its risks: the financial liability for resources consumed is unlimited. CSPs such as Amazon EC2 and Rackspace charge US$0.12 per Gbyte (up to 40 Tbytes) and $0.18 per Gbyte, respectively, for outbound data transfers.3,4 As Figure 1 shows, the cloud consumer (the victim) incurs a cost each time a cloud application (the attack target) services a reply. A high volume of requests can be costly. Malicious use is even more burdensome, because the additional run-up in expenses has no associated business value. As it stands today, CSPs don't monitor cloud consumers' applications, so it's up to the cloud consumer to prevent, monitor, and respond to such fraudulent behavior.

Fraudulent Resource Consumption

To better understand the FRC attack, consider the time-series visualization of a Web server log shown in Figure 2.1,2 The y-axis depicts the number of requests per second, and as the x-axis shows, the time series covers a two-week period. As is common, the modeled Web server capacity is sufficiently over-provisioned; this represents a conservative estimate, given the capacity of CSP Web servers. Superimposed on top of normal Web activity are serviced requests from an FRC attack.
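The per-reply cost is tiny, which is exactly what makes this vulnerability easy to overlook. A back-of-the-envelope sketch, using the $0.12/Gbyte price quoted above and a hypothetical average reply size:

```python
# Back-of-the-envelope outbound-transfer costs under pay-as-you-go pricing.
# The price is the Amazon EC2 figure cited in the text; the reply size is
# a hypothetical illustration.

PRICE_PER_GB = 0.12       # USD per GByte outbound
RESPONSE_BYTES = 100_000  # assumed average reply size: 100 KB

def transfer_cost(n_requests: int, bytes_per_reply: int = RESPONSE_BYTES,
                  price_per_gb: float = PRICE_PER_GB) -> float:
    """Outbound-transfer cost, in USD, of servicing n_requests replies."""
    return n_requests * bytes_per_reply / 1e9 * price_per_gb

# One reply costs a tiny fraction of a cent...
assert transfer_cost(1) < 0.0001
# ...but a million fraudulent requests already show up on the bill.
assert round(transfer_cost(1_000_000), 2) == 12.0
```

Because the consumer, not the CSP, is billed for every serviced reply, each fraudulent request transfers a small, hard-to-dispute cost onto the victim.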
As Figure 2 shows, the initial attack intensity beyond normal activity falls in the nuisance activity region, because the resultant costs are insignificant to the cloud consumer.
Figure 1. A cloud network-attack diagram. Botnets can exploit the cloud utility model to perform fraudulent resource consumption (FRC), making consumers incur unexpected costs from dishonest use.

However, as malicious activity intensifies beyond this region, the malicious costs to the cloud consumer start to become a matter of concern; this transition point is labeled J1. Malicious activity that exceeds J1 enters into the FRC attack region. Within this region, bounded by J1 and J2, an FRC attack doesn't significantly degrade the Web server's quality of service (QoS). If the attack intensity increases above J2, the request volume reaches a point at which the Web server QoS starts to significantly degrade; at this point, current application-layer DDoS detection and mitigation schemes are effective.5 An objective of FRC attack mitigation research is to improve detection sensitivity, pushing J2 closer to J1 and thus narrowing the FRC attack region by detecting attacks whose transactions are legitimate but whose requestors' intent differs. As shown on the right side of Figure 2, the probability of detecting an FRC attack increases as the attack intensity increases. Although nothing prevents an attacker from exploiting the utility model with an attack intensity in the DDoS attack region, such a blatant action carries a higher risk of detection and, ultimately, mitigation. Depending on the attack objectives and the cost to the attacker, a modest request intensity within the FRC attack region over an extended duration has a higher chance of success, because it is considerably more difficult for a victim to mitigate.
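The regions in Figure 2 can be captured as a toy classifier. The J1 and J2 thresholds below are hypothetical values chosen for illustration; in practice they depend on a site's normal traffic level and cost tolerance:

```python
# Toy classifier for the attack-intensity regions described in Figure 2.
# J1/J2 are hypothetical thresholds, expressed as request rate in excess
# of the site's normal baseline.

J1 = 5.0   # req/s above normal where costs stop being a mere nuisance
J2 = 90.0  # req/s above normal where QoS degrades and DDoS tools kick in

def region(excess_rate: float) -> str:
    """Classify request intensity above the normal baseline (req/s)."""
    if excess_rate <= J1:
        return "nuisance"
    if excess_rate <= J2:
        return "FRC"   # costly but QoS-preserving: the hardest band to detect
    return "DDoS"

assert region(1.0) == "nuisance"
assert region(30.0) == "FRC"
assert region(500.0) == "DDoS"
```

The research goal described above amounts to shrinking the gap between J1 and J2, so that less of the cost-inflicting band escapes detection.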
Faced with such an attack, current DDoS mitigation schemes, firewalls, and intrusion prevention and detection systems would be rendered ineffective, because individual fraudulent requests are protocol compliant and attack rates don't degrade the Web server QoS. As a result, and given the utility pricing model, the potential for an FRC attack fundamentally changes the requirements of Web-based anomaly detection for the cloud. Figure 3 depicts an FRC attack as a slow-and-low assault, or death by a thousand requests. Unlike short-lived DDoS attacks, an FRC attack could last weeks or months if not detected. Because maliciously consumed resources are additive to normal traffic, the aggregate of legitimate and malicious resource use is reflected in the cloud consumer's monthly bill.

Availability in this context isn't a binary measure in which the system is nearly incapacitated at the time of the attack. The technical infrastructure of a website hosted in a CSP environment will have no trouble functioning while an FRC attack is underway. Instead, availability is a long-term consideration, defined as the cloud consumer's ability to withstand the financial consequences of an FRC attack over a prolonged period.

FRC Risk

Adopting the public cloud model brings with it new and old security risks. Here, we focus on the risk introduced by the utility pricing model by discussing the likelihood and effects of an FRC attack. The likelihood of a cloud consumer falling victim to an FRC attack depends on the attacker's skill level, computing capacity, and motivation, as well as his or her ability to exploit the utility pricing model. This pricing vulnerability is literally hiding in plain sight, because CSPs openly publish their pricing metrics. From a technical standpoint, all that's necessary for an attacker to exploit this vulnerability is to make standard requests for Web content that the cloud consumer makes publicly available.
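The slow-and-low economics are easy to sketch from published prices alone. Only the $0.12/Gbyte price comes from this article; the botnet size, per-bot request rate, and reply size below are hypothetical illustrations:

```python
# Sketch of "death by a thousand requests" economics. The price is the
# article's cited figure; botnet size, per-bot rate, and reply size are
# hypothetical.

PRICE_PER_GB = 0.12         # USD per GByte outbound
BOTS = 10_000               # assumed botnet size
REQS_PER_BOT_PER_DAY = 100  # modest enough to blend in with normal clients
REPLY_BYTES = 250_000       # assumed 250-KB average reply

daily_bytes = BOTS * REQS_PER_BOT_PER_DAY * REPLY_BYTES
daily_cost = daily_bytes / 1e9 * PRICE_PER_GB
monthly_cost = 30 * daily_cost

# Each bot stays far below any rate that degrades QoS, yet the attack
# quietly adds hundreds of dollars to the monthly bill.
assert round(daily_cost, 2) == 30.0
assert round(monthly_cost, 2) == 900.0
```

Because no single bot's footprint looks abnormal, the cost accrues invisibly until the monthly bill arrives.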
Although a large botnet is the worst-case threat source, conceivably any Internet-connected device could perform an FRC attack with a Perl script making HTTP GET requests or with the Low-Orbit Ion Cannon, an open-source tool that has fueled recent DDoS attacks. 6 As evidenced by the growing number, capacity, and sophistication of both botnets and DDoS attacks, the worst-case threat sources undoubtedly possess the skills and resources to mount a sustained and effective FRC attack. The only real factor preventing an FRC attack is a lack of motivation. Yet similar to those who orchestrate DDoS attacks, the motive of an FRC attacker could range from ego and hacktivism to monetary gain, extortion, revenge, competitive advantage, or economic espionage. 7 If recent history is any guide, those who control botnets could perform an FRC attack to promote a political agenda or support an ideological viewpoint.

For the victim, the direct monetary effect of an FRC attack is a function of the average request intensity and attack duration. To
enumerate one end of the extreme, a weeklong DDoS attack launched from a 250,000-node botnet in 2011 peaked at 45 Gbps. 8 If the aforementioned attack peak was sustained on a cloud instance at $0.12/Gbyte, the resultant cost would have been $0.675 per second, which adds up to $408,240 per week. On the other end of the FRC attack region, consider the website modeled in Figure 2. At an average normal request rate of three requests per second, a 250,000-node botnet could double the data usage costs if each bot client generated just two requests per day. Clearly, given the capacity of modern-day networks and computers, the bot clients in this example could significantly increase their daily request quota and multiply the attack cost by orders of magnitude. However, once a bot client's usage footprint eclipses the expected behavior of legitimate clients, the risk of being identified as malicious greatly increases.

Figure 2. Malicious-requests behavior. The initial attack intensity (labeled J1) results in insignificant costs for the cloud consumer. However, as malicious activity intensifies beyond this nuisance activity region, the cost to the consumer starts to become a matter of concern. Yet distributed denial-of-service detection schemes aren't effective at this lower intensity level (below J2). [Plot: requests per second over days, partitioned by the thresholds J1 and J2 into normal activity, nuisance activity, the FRC attack region, and the DDoS attack region, with a secondary axis showing the probability of detection.]

Defending Against an FRC Attack

Defending against an FRC attack is a significant challenge to the cloud consumer, owing to the atypical and unassuming nature of the attack. As is the case with most attack risks, the cloud consumer has four primary objectives: prevention, detection, attribution, and mitigation.

Prevention

A common way to prevent the exploitation of a vulnerability is to download and apply a patch for it.
However, in the context of this discussion, the bug isn't a software defect but a common business model deployed by CSPs. Until this vulnerability is actually exploited, the cloud business model isn't likely to change. So in lieu of a patch for this vulnerability, there are several, albeit limited, prevention options.

The use of authentication on a target website would significantly reduce the amount of exploitable resources, but we don't consider it here because we assume the cloud consumer wants to host public content. Similarly, graphical puzzles (Captcha tests) could be used as a preemptive solution to differentiate humans and zombie computers. However, the use of such a test could be detrimental to the overall goals of a public-facing website, because these types of tests will result in a certain percentage of legitimate clients being unable or unwilling to solve such puzzles. Another option would be for the cloud consumer to work with application and content developers to minimize the resource footprint of common or average requests. Limiting the impact of client requests increases FRC attacker costs and risk of detection. Unfortunately, without a utility model patch, these controls won't thwart a motivated attacker. So with limited prevention capability, the next line of defense is detection.

Detection

FRC detection aims to identify malicious traffic consumption. Because an FRC attack is subtle, previous application-layer DDoS solutions that focus on high request intensities aren't suitable. 9 Instead, initial FRC-detection approaches focus on behavioral metrics derived from Web server log files that seek to profile the aggregate webpage request choices of a website's client base. 2 Three measures, the Spearman, Overlap, and Zipf metrics, respectively characterize the accuracy, completeness, and relative proportionality of ranked requests between two adjacent windows of observed logs (for example, two three-day windows).
Together, these three metrics provide consistent measures with which to describe normal behavior and perform anomaly detection. 2 However, for the sake of brevity, we don't present empirical results here. The conclusion stemming from this work is that an attacker, without knowledge of the training dataset (the historical Web server log), has a difficult time requesting an impactful volume of Web documents while adhering to the structure of normal traffic. Thus our proposed methodology, 2 which focuses on characterizing aggregate Web traffic, is effective for detecting even minor increases in fraudulent Web activity, well before the resultant costs are harmful.

The most practical detection approach is the classic "review your bills" approach. Reviewing bills over time to determine if they're within an expected range can help expose an FRC attack. Log analyzers might also help identify outlier application usage, triggering an investigation of suspicious clients. A casual inspection, however, won't catch a savvy FRC attacker.
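The windowed comparison idea can be sketched as follows. These are simplified stand-ins for the Spearman and Overlap metrics (the Zipf metric is omitted); the cited work defines all three precisely.

```python
# Compare the ranked webpage-request profiles of two adjacent observation
# windows. A simplified illustration, not the authors' exact formulation.

def ranks(counts):
    """Rank pages by request count, most requested first (rank 1)."""
    ordered = sorted(counts, key=counts.get, reverse=True)
    return {page: i + 1 for i, page in enumerate(ordered)}

def spearman(window_a, window_b):
    """Spearman rank correlation over pages seen in both windows."""
    shared = set(window_a) & set(window_b)
    n = len(shared)
    if n < 2:
        return 0.0
    ra = ranks({p: window_a[p] for p in shared})
    rb = ranks({p: window_b[p] for p in shared})
    d2 = sum((ra[p] - rb[p]) ** 2 for p in shared)
    return 1 - 6 * d2 / (n * (n * n - 1))

def overlap(window_a, window_b):
    """Fraction of distinct pages common to both windows."""
    union = set(window_a) | set(window_b)
    return len(set(window_a) & set(window_b)) / len(union) if union else 0.0
```

A sustained FRC campaign that requests pages out of proportion to the site's historical profile drags both scores down between windows, even at modest request rates.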
Attribution

Attribution in this context is the ability to accurately differentiate legitimate clients from FRC attack clients. Like the previously discussed DDoS detection solutions, current attribution solutions are geared toward detecting malicious clients that consume a significant volume of requests in a very short time. Previous work has focused on scrutinizing the increased inter-request (the time between successive Web document requests) or inter-session (the time between Web browsing sessions) arrival rates of malicious clients in comparison to the rate profile of normal users. 10 Again, it's contrary to FRC attack objectives for a single attack client to behave in a fashion similar to one participating in a DDoS attack. The challenge in this research area will be to minimize the number of falsely identified legitimate clients while decreasing the impact of fraudulent clients.

Recent research indicates that normal client behavior can be characterized by client actions such as request volume per client, Web documents requested, and Web session parameters (for example, requests per session and number of sessions). 11 If attack clients that aren't privy to normal usage activity exceed a set threshold on these characteristics, they're flagged as malicious. This attribution methodology aims to be transparent to clients, and it operates under the condition that all clients are innocent until their usage footprint proves otherwise. Limiting the impact of individual clients reduces the overall risk of an FRC attack. It's important to note that this methodology is not rate-based; rather, it's sensitive to the accumulated requests an attacker invokes. Therefore, the choices an attacker makes could allow a malicious client to be deemed anomalous after it invokes a minimal number of requests.

Mitigation

Reactive solutions rely on accurate detection and attribution. We must consider the potential for legitimate clients being errantly classified as malicious.
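The accumulation-based attribution described above might be sketched as follows; the characteristic names and threshold values are hypothetical, and in practice would be derived from historical logs of legitimate clients.

```python
# Sketch of threshold-based attribution: accumulate a per-client usage
# footprint over a billing period and flag clients whose footprint exceeds
# what normal behavior exhibits. All thresholds are invented examples.

THRESHOLDS = {
    "requests": 5000,   # total requests per client per period
    "documents": 400,   # distinct Web documents requested
    "sessions": 200,    # number of browsing sessions
}

def flag_client(profile, thresholds=THRESHOLDS):
    """Return the names of any characteristics a client has exceeded."""
    return [name for name, limit in thresholds.items()
            if profile.get(name, 0) > limit]

# Clients are presumed legitimate until their footprint proves otherwise:
typical = {"requests": 900, "documents": 60, "sessions": 25}
greedy = {"requests": 20000, "documents": 80, "sessions": 30}
```

Note that nothing here is rate-based: a slow client that eventually accumulates an outsized footprint is flagged just the same.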
As a result, approaches like blacklisting first-time offenders might prove heavy-handed. Less absolute mitigation strategies include imposing a back-off timeout on anomalous clients, in which requests from an IP address aren't all serviced. Similarly, suspicious clients could be served a graphical puzzle to prove that the client is indeed a human. These reactive approaches are available today, and each has its own tradeoffs. However, with limited detection and attribution solutions available, the deployment and maintenance of such solutions will be challenging.

Figure 3. Aggregation of an FRC attack, a slow-and-low assault. Unlike short-lived DDoS attacks, an FRC attack could last weeks or months if not detected. [Plot: actual cost across a billing period (January through October), showing an aggregate FRC attack cost curve rising above legitimate resource use as malicious resource use accumulates.]

Letting any client with access to the Internet consume resources that are in turn metered and billed exposes the cloud consumer to a risk that's only mitigated by time, detection, and accountability. Until recently, this vulnerability has been neglected. Unless utility models are restructured to remove the vulnerability of an FRC attack, research in detection and attribution is necessary to ensure the long-term sustainability of cloud consumers and remove one more impediment that could dissuade organizations from adopting public cloud computing.

To the best of our knowledge, there have been no known public acknowledgements of an FRC attack occurring on the public cloud. However, the absence of such knowledge doesn't confirm that the utility model vulnerability hasn't been or won't be exploited. Back in the early 1990s, Internet-facing firewalls were new and thought to be sufficient to secure a connected enterprise. In reality, attacks were occurring, as intrusion-detection systems soon pointed out.
Perhaps the utility model has been exploited and, as an IT community, we're presently ill-equipped to detect its presence or identify its culprits.

References
1. J. Idziorek and M. Tannian, "Exploiting Cloud Utility Models for Profit and Ruin," Proc. IEEE 4th Int'l Conf. Cloud Computing (Cloud '11), IEEE, 2011.
2. J. Idziorek, M. Tannian, and D. Jacobson, "Detecting Fraudulent Use of Cloud Resources," Proc. 3rd ACM Cloud Computing Security Workshop (CCSW '11), ACM, 2011.
3. "Amazon EC2 Pricing," Amazon Web Services, 2012; pricing.
4. "Cloud Servers Pricing," Rackspace Cloud Servers, 2012; com/cloud/cloud_hosting_products/servers/pricing.
5. S. Kandula et al., "Botz-4-Sale: Surviving Organized DDoS Attacks that Mimic Flash Crowds," Proc. 2nd Symp. Networked Systems Design & Implementation, Usenix, 2005.
6. L. Page, "Join in the Wikileaks DDoS War from your iPhone or iPad," The Register, 10 Dec. 2010; co.uk/2010/12/10/loic_for_iphone.
7. G. Stoneburner, A. Goguen, and A. Feringa, "Risk Management Guide for Information Technology Systems," NIST Special Publication, July.
8. L. Constantin, "Denial-of-Service Attacks Are on the Rise, Anti-DDoS Vendors Report," IDG News Service, 7 Feb. 2012; world.com/businesscenter/article/249438/denialofservice_attacks_are_on_the_rise_antiddos_vendors_report.html.
9. S. Wen et al., "CALD: Surviving Various Application-Layer DDoS Attacks that Mimic Flash Crowd," Proc. Int'l Conf. Network and System Security (NSS '10), IEEE, 2010.
10. S. Ranjan et al., "DDoS-Shield: DDoS-Resilient Scheduling to Counter Application Layer Attacks," IEEE/ACM Trans. Networking, Feb. 2009.
11. J. Idziorek, M. Tannian, and D. Jacobson, "Attribution of Fraudulent Resource Consumption in the Cloud," Proc. IEEE 5th Int'l Conf. Cloud Computing (Cloud '12), IEEE, 2012.

Joseph Idziorek is a PhD candidate in the Department of Computer and Electrical Engineering at Iowa State University. His research interests broadly include anomaly detection and, more specifically, the detection and attribution of FRC attacks on the cloud utility model. Idziorek received his BS in computer engineering from St. Cloud State University. Contact him at [email protected].

Mark F. Tannian is a PhD candidate in the Department of Computer and Electrical Engineering at Iowa State University. His research interests include user-centered design and information security visualization in addition to cloud computing security. Tannian received his MS in electrical engineering from George Washington University. Contact him at [email protected].

Doug Jacobson is a University Professor in the Department of Computer and Electrical Engineering at Iowa State University, where he serves as the director of the Information Assurance Center.
His research interests include Internet-scale event and attack generation environments. Jacobson received his PhD in computer engineering from Iowa State University. Contact him at [email protected].

This article originally appeared in IT Professional, March/April 2013.

IEEE CLOUD 2013
IEEE 6th International Conference on Cloud Computing
June 27-July 2, 2013, Santa Clara Marriott, CA, USA

"Change we are leading" is the theme of CLOUD 2013. Cloud computing has become a scalable services consumption and delivery platform in the field of services computing. The technical foundations of cloud computing include service-oriented architecture (SOA) and virtualizations of hardware and software. The goal of cloud computing is to share resources among the cloud service consumers, cloud partners, and cloud vendors in the cloud value chain. Register today!
Focus on Your Job Search

IEEE Computer Society Jobs helps you easily find a new job in IT, software development, computer engineering, research, programming, architecture, cloud computing, consulting, databases, and many other computer-related areas. New feature: find jobs recommending or requiring the IEEE CS CSDA or CSDP certifications! Visit to search technical job openings, plus internships, from employers worldwide.

The IEEE Computer Society is a partner in the AIP Career Network, a collection of online job sites for scientists, engineers, and computing professionals. Other partners include Physics Today, the American Association of Physicists in Medicine (AAPM), American Association of Physics Teachers (AAPT), American Physical Society (APS), AVS Science and Technology, and the Society of Physics Students (SPS) and Sigma Pi Sigma.
The Threat in the Cloud
Matthew Green, Johns Hopkins University

People like to tell us that the cloud is the future. I'd love to write this off as hype, but this time the hype is probably accurate. Though a few traditionalists might still choose to run their own balky hardware, the next generation of online services will almost certainly run on somebody else's servers, using somebody else's software. Needless to say, this has major implications for data security.

Take the popular photo-sharing site Instagram, for instance. Rather than purchasing or renting servers, Instagram's developers deployed the entire service using rented instances on Amazon's popular EC2 cloud-computing service (EC2 is short for Elastic Compute Cloud; amazon.com/ec2). 1 Although Instagram is hardly a security product, it does manage private user data with cryptographic services such as Secure Sockets Layer (SSL) and Secure Shell. This implies the use of public-key cryptography and the corresponding presence of secret keys, all stored on hardware that Instagram doesn't control.

This might not be a problem in a traditional datacenter environment. However, cloud-computing platforms often mingle user tasks across shared physical hardware. Most users are blissfully unaware of this mingling because cloud providers carefully isolate individual customers into separate virtual machines (VMs), much the way hotels isolate guests in separate rooms. In theory, a VM should keep nosy users from stealing their neighbors' sensitive data. But when it comes to VMs that perform cryptography, some new research tells us that the existing protections might not be sufficient. To this end, a team of researchers from the University of North Carolina, RSA Laboratories, and the University of Wisconsin demonstrated that you can extract cryptographic keys from one VM to another, even when all the standard cloud security features are in place. 2 This is made possible thanks to side channels: pathways that leak sensitive data much the same way a hotel wall leaks sound.

Side Channels

Side-channel attacks have played a major role in the history of cryptography. Usually, these attacks occur when a machine leaks details of its internal operation through some unexpected vector, for example, computation time or electromagnetic emissions. The cloud environment offers a bonanza of potential side channels because different VMs share physical resources (for example, processor, instruction cache, or disk) on a single computer. If an attacking program can carefully monitor those resources' behavior, it can theoretically determine what another program is doing with them.

This threat has long been discussed by cloud security experts but has largely been dismissed by providers. This is because, in this area, turning theory into practice turns out to be surprisingly difficult. There are many reasons for this. For one thing, cloud providers often run many different VMs on the same server, which tends to add noise and foil an attacker's careful measurements. The Virtual Machine Manager (VMM) software itself adds more noise and places a barrier between the attacking user and the bare metal of the server. Moreover, individual VMs are routinely swapped between different cores of a multicore server, which makes it difficult to know what you're actually measuring. All of these factors combine to make side-channel attacks extremely challenging.

The New Attack

The new research focuses on the Xen VMM, which is the software Amazon uses to run its EC2 service. Although the attack isn't implemented in EC2 itself, it focuses on similar hardware: multicore servers with simultaneous multithreading (SMT) turned off. The threat model assumes that the attacker and victim VM are coresident on the machine and that the victim is decrypting an Elgamal ciphertext using libgcrypt v ( org/software/libgcrypt).
Elgamal encryption is a great case for side-channel attacks because you implement it by taking a portion of the ciphertext, which we'll call x, and computing x^e mod N, where e is the secret key and N is (typically) a prime number. This exponentiation is implemented by a square-and-multiply algorithm that depends fundamentally on the secret key's bits (see Figure 1). If the ith bit of e is 1, steps M (multiply) and R (modular reduce) execute. If that bit is 0, they don't. The key's bits result in a distinctive set of computations that can be detected if the attacking VM can precisely monitor the hardware state.

Side-channel attacks employing square-and-multiply have been around for a while. They date back at least to the mid-to-late 1990s, 3 using power and operating time as a channel, and they've been repeatedly optimized as technology has progressed. More recent attacks have exploited cache misses in a shared instruction cache (typical in hyperthreading environments) as a way for one process to monitor another. 4 However, no one had applied these attacks to the full Xen VM setting. Such an application is challenging for various reasons, including the difficulty of getting the attacking process to run frequently enough to take precise measurements, the problem that virtual CPUs (VCPUs) can be assigned to different cores or that irrelevant VCPUs can be assigned to the same core, and noisy measurements that give only probabilistic answers about which operations occurred on the target process. The task facing these researchers was therefore to overcome all these noise sources and still recover useful information from the attacked VM.

Exploiting Cache Misses

At a fundamental level, this new attack is similar to previous attacks that worked by measuring the behavior of the shared instruction cache. 4 The attacking VM first primes the L1 instruction cache by allocating continuous memory pages.
It then executes a series of instructions to load the cache with cache-line-sized blocks it controls.

  SquareMult(x, e, N):
    let e_n, ..., e_1 be the bits of e
    y <- 1
    for i = n down to 1 {
      y <- Square(y)           (S)
      y <- ModReduce(y, N)     (R)
      if e_i = 1 then {
        y <- Mult(y, x)        (M)
        y <- ModReduce(y, N)   (R)
      }
    }
    return y

Figure 1. The square-and-multiply algorithm. 2 Its operation depends fundamentally on the secret key's bits.

Next, the attacker gives up execution and hopes that the target VM will run next on the same core and, moreover, that the target is running the square-and-multiply algorithm. If it is, the target will cause a few cache-line-sized blocks of the attacker's instructions to be evicted from the cache. The key to the attack is that the choice of which blocks are evicted depends highly on the operations the target conducts. To see what happened, the attacking VM must recover control as quickly as possible. It then probes to see which blocks have been evicted, by executing the same instructions and timing the results. If a given block has been evicted, execution will result in a cache miss and a measurable delay. By compiling a list of the missing blocks, the attacker gains insight into which instructions might have executed while the target VM was running.

A big challenge for the attacker is to regain control quickly. Wait too long, and all kinds of things will happen; the state of the cache won't give any useful information. Normally, Xen doesn't allow VCPUs to rapidly regain control, but exceptions exist: Xen gives high priority to VCPUs that receive an interrupt. The researchers exploited this by running a 2-VCPU VM, in which the second VCPU's only job was to issue interprocessor interrupts to get the first VCPU back in control as quickly as possible. Using this approach, they could get back in the saddle within about 16 microseconds. This is an eternity in processing time, but it's short enough to give useful information.
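The routine in Figure 1 is easy to render as runnable code, instrumented to record the S/R/M operation sequence. The point of the attack is that this sequence, observable through cache behavior, is a transcript of the secret exponent's bits; a sketch (function and variable names are ours):

```python
# Left-to-right square-and-multiply, as in Figure 1, recording the operation
# trace. Recovering such a trace is exactly what the cache side channel
# gives the attacker.

def square_mult(x, e, n):
    """Compute x**e mod n and return (result, operation trace)."""
    trace = []
    y = 1
    for bit in bin(e)[2:]:       # bits of e, most significant first
        y = (y * y) % n
        trace += ["S", "R"]      # square, then modular reduce (always)
        if bit == "1":
            y = (y * x) % n
            trace += ["M", "R"]  # multiply only when the key bit is 1
    return y, trace

def bits_from_trace(trace):
    """Read the exponent back out of the trace: S R M R -> 1, S R -> 0."""
    bits, i = "", 0
    while i < len(trace):
        if trace[i + 2:i + 3] == ["M"]:
            bits += "1"
            i += 4
        else:
            bits += "0"
            i += 2
    return bits
```

Given a clean trace, `bits_from_trace` recovers every bit of e; the researchers' real problem, as the article goes on to explain, is that the observed traces are short, noisy fragments rather than one clean transcript.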
Making Order out of Chaos

The problem with the attack described above is that the attacking VM has no control over where in the computation it will jump in. It could get just a small fragment of the square-and-multiply algorithm (which comprises hundreds or thousands of operations). It could jump into the OS kernel. It could even get the wrong VM, because VMs can run on any core. Moreover, the data can be pretty noisy. The solution to these problems is what makes the new research so fascinating.

First, the researchers didn't just monitor one single execution; they assumed that the device was constantly decrypting different ciphertexts, all with the same key. Indeed, this sort of repeated decryption is precisely what happens inside an SSL webserver. Next, the researchers applied machine-learning techniques to identify which of the many possible instruction sequences were associated with particular cache measurements. This required them to train the algorithm on the target hardware, with the target VCPU conducting square, multiply, and modular-reduce calls to build a training model. During the attack, they further processed the data using a hidden Markov model to eliminate errors and bogus measurements that cropped up from noncryptographic processes.

Even after all this work, an attacker winds up with thousands of fragments, some containing errors or low-confidence results. These can be compared against each other to reduce errors, then stitched together to
recover the secret key itself. This problem has been solved in many other domains (most famously, DNA sequencing); the techniques used here are quite similar. Figure 2 illustrates this process using an invented example that reconstructs six fragments to form a single spanning sequence. This is a huge simplification of a very neat and complex process that's well described in the research paper.

  S1: SRSRMRSMRSRSRSMR
  S2: MRSRSRSRMR**SRMRSR
  S3: SRMRSRSR
  S4: MRSRSRSR**SRMRSR
  S5: MR*RSRMRSRMRSR
  S6: MRSRSRMRSRSRSRMR

  SRSRMRSRMRSRSRSMRSRSRMRSRSRSRMRSRMRSRSRMRSRMRSR

Figure 2. Reconstructing six fragments to form a single spanning sequence. This process can recover the private key. In the fragments and recreated sequence, M, R, and S stand for multiplication, modular-reduce, and square calls. Bold letters indicate overlapping instruction sequences.

The Outcome

With everything in place, the researchers attacked a 4,096-bit Elgamal public key, which (owing to an optimization in libgcrypt) had a 457-bit private key e. After several hours of data collection, they obtained about 1,000 key-related fragments, of which 330 were long enough to be useful for key reconstruction. These let the attackers reconstruct the full key with only a few missing bits, which they could guess using brute force. And that, as they say, is the ballgame.

What Does This Mean for Cloud Cryptography?

Before you start pulling down your cloud VMs, a few points of order. First, there's a reason these researchers conducted their attack with libgcrypt and Elgamal and not, say, OpenSSL and RSA (which would be a whole lot more useful). That's because libgcrypt's Elgamal implementation is the cryptographic equivalent of a 1984 Stanley lawnmower engine. It uses textbook square-and-multiply with no ugly optimizations to get in the way. OpenSSL RSA decryption, on the other hand, is more like a 2012 Audi turbodiesel.
It uses windowing, the Chinese remainder theorem, blinding, and two types of multiplication, all of which make these attacks much more challenging.

Second, this attack requires perfect conditions. As proposed, it works only with two VMs and, as we mentioned, requires training on the target hardware. This isn't a fundamental objection, especially because real cloud services do use much identical hardware. However, it does mean that messiness, the kind you get in real cloud deployments, will be more of an obstacle than it was in the research setting.

Finally, before you can target a VM, you must get your attack code onto the same hardware as your target. This seems like a pretty big challenge. Unfortunately, some slightly older research indicates that this is feasible in existing cloud deployments. 5 In fact, for only a few dollars, researchers were able to colocate themselves with a given target VM with about 40 percent probability. 6

In the short term, you certainly shouldn't panic about this, especially given how elaborate the attack is. But this new research does indicate that we should be thinking hard about side-channel attacks and how to harden our cloud platforms to deal with them.

References
1. "What Powers Instagram: Hundreds of Instances, Dozens of Technologies," Instagram, 2012; -powers-instagram-hundreds-of-instances-dozens-of.
2. Y. Zhang et al., "Cross-VM Side Channels and Their Use to Extract Private Keys," Proc. 19th ACM Conf. Computer and Communications Security (CCS '12), ACM, 2012.
3. P.C. Kocher, "Timing Attacks on Implementations of Diffie-Hellman, RSA, DSS, and Other Systems," Proc. 16th Ann. Int'l Cryptology Conf. Advances in Cryptology (Crypto '96), Springer, 1996.
4. C. Percival, "Cache Missing for Fun and Profit," 2005; edu/6.858/2012/readings/ht-cache.pdf.
5. T. Ristenpart et al., "Cross-VM Vulnerabilities in Cloud Computing," presentation at 29th Int'l Cryptology Conf. (Crypto '09) rump session, 2009; cr.yp.to/8d9cebc9ad358331fcde611bf45f735d.pdf.
6. T. Ristenpart et al., "Hey, You, Get off of My Cloud: Exploring Information Leakage in Third-Party Compute Clouds," Proc. 16th ACM Conf. Computer and Communications Security (CCS '09), ACM, 2009.

Matthew Green is a cryptographer and research professor at Johns Hopkins University's Information Security Institute. Contact him at [email protected].

Illustration by Robert Stack.

This article originally appeared in IEEE Security & Privacy, January/February 2013.
Implementing Effective Controls in a Mobile, Agile, Cloud-Enabled Enterprise
Dave Martin, EMC

Several approaches change security clichés into reality by removing the barriers of culture and trust that often prevent the implementation of effective controls.

As security professionals, we attend meetings and pronounce that security is everyone's responsibility, and everyone nods in agreement. But in reality, everyone still believes that the security team has the ball. Another favorite cliché is that security should be "built in, not bolted on"; however, finding tangible examples of fully integrated or built-in security is difficult. As security practitioners, it's hard not to blame ourselves for these realities; we've done little to really push these agendas forward. After all, we're all paid paranoids with trust issues that lead us to implement solutions with added layers of controls that don't address root cause, enabling ongoing bad behaviors from the people we don't trust. In this article, I assert that we must break this cycle, build partnerships, implement effective security controls, and improve the long-term effectiveness of our control environments in an increasingly complex, agile, cloud- and mobile-enabled world.

A Vision of a Better Future

Using layers of security is likely a good idea, but creating layers to mask the true root cause is not. For example, although there are legitimate use cases for Web application firewalls, they're often used to address unknown vulnerabilities in underlying infrastructure and application layers or to provide security log visibility. Other network-based controls, such as network data loss prevention, can be effective, but they add complexity and cost, reduce agility, and introduce additional points of failure to critical operating environments.
Vulnerability-scanning services provide critical health data about our environments to address weaknesses in our asset and configuration management systems, but having the application platform and hosts report on the operating environments' configuration and patch levels would be more useful. We continue to leverage controls ineffectively, partly by habit and partly because we've failed to address the fundamental issues, often assuming that processes and infrastructure cannot be made inherently more secure and that they require complex bolted-on layers of security.

Given the current environment's forcing functions (agility, mobility, virtualization, evolving threats, and cost), we must look for new ways to build our infrastructure and systems. Maintaining existing layers of complexity will involve large amounts of automation, configuration, and change management. Trying to ensure that an application stack remains protected using bolted-on layers of security as it moves from datacenter to datacenter will become a huge challenge. We must plan now, not by eliminating controls (although we should take this opportunity to review them), but by examining the controls we need and how they're applied.

We must foster a stronger relationship with the application development teams to ensure they have adequate training on security and threats and follow a solid software development life cycle (SDLC). These teams are vital in integrating our controls directly into the application stack. We must empower them to use integration APIs to many of our common controls and fully embed them into the protected application. For example, where better to deliver data loss prevention than in the application itself? An API call can validate when it's acceptable for the application to transmit data given the context of user, role, device, and so forth. The application is better able to make this decision than, say, a bump
in the wire struggling to deal with encrypted data with little or no context. These types of security and risk decisions should be applied in the application's business logic layer.

Application development teams must also produce intelligent log streams. Traditionally, these logs are used for troubleshooting and debugging, but an external system might be necessary to combine several log events to produce a security event. A better log stream would include more security-relevant details on traditional events and those produced by embedding security controls. The application, with its context of business rules, user activity, and a unique sense of data value, should produce security-targeted logs as well as highlight risky transactions. These logs must still be parsed and aggregated to give a full enterprise picture, but they will enable security incident and monitoring teams to detect and respond to detailed application incidents more effectively. In addition, because the control layer is integrated, there are fewer controls to reconfigure if the application moves.

Moving to the New Paradigm

Moving to this new paradigm will be a journey. We have too many traditional implementations, and beyond that, we need to retrain our IT departments, update our control toolboxes, and address the question of who is responsible for security. First, we must address the foundational components of the enterprise security program, ensuring the SDLC has solid metrics to verify that controls are effective and correctly implemented, along with strong quality assurance and testing processes and tools. Development teams should perform these tasks with governance by the information security or risk functions. We should also check that configuration and change management is well executed in the target environment.
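The application-embedded controls described earlier, a data-transmission policy check plus a security-targeted log record, might look like the following sketch. The rule set, field names, and function names are all hypothetical inventions, not any real product's API.

```python
# Sketch: data loss prevention enforced in the application's business logic,
# where user, role, device, and data-classification context is available.
# The policy below is an invented example for illustration only.

ALLOWED = {
    # (data_class, role) pairs permitted to leave the application
    ("public", "employee"),
    ("public", "contractor"),
    ("internal", "employee"),
}

def may_transmit(role, device_managed, data_class):
    """Decide, in full context, whether the application may transmit data."""
    if data_class != "public" and not device_managed:
        return False            # sensitive data only leaves managed devices
    return (data_class, role) in ALLOWED

def security_log(event, **context):
    """Emit an intelligent, security-targeted log record, not a debug line."""
    return {"event": event, "risky": event.endswith("denied"), **context}
```

A monitoring team can consume records like `security_log("transmit.denied", user="alice", data_class="internal")` directly, without reassembling intent from debug output, and the control travels with the application if it moves.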
Vulnerabilities and misconfigurations should be well managed, configuration management systems should be monitored in real time, and control gaps and vulnerabilities should be addressed in a timely manner. With this solid foundation, we should next look to our collection of controls. Many might already have the hooks to be implemented through APIs; this is a good place to begin the transition. Other control implementations might require reevaluating technologies or methods. In addition, we should update log standards to ensure that intelligent logs are being produced and that they're reaching the incident monitoring team with the correct context and response procedures.

These technical and process changes are the easy part of the transition. The long-term culture change will be harder to address. Security teams are thought of as the implementation point of control processes, and often these teams don't believe anyone else will implement controls. It's time that we, as security professionals, start challenging these assumptions. This will take time and require directed effort: a combination of training IT practitioners on the real threats and the controls that combat them; reimagining controls and how we use them; and improving measurement, governance, and accountability processes.

As with any modification to environments with legacy technology, processes and people won't change overnight. We must act with sponsorship across IT leadership, picking targets that demonstrate the benefits of this approach. By measuring benefits over time and applying these concepts when applications are re-platformed, we can complete the transition, creating an infrastructure that is simpler, more agile, and cheaper, and that has more effective integrated controls.

Dave Martin is the chief security officer at EMC. His research interests include adaptive controls, cloud risk management, and incident detection and response.
Martin received a BEng in manufacturing systems engineering from the University of Hertfordshire in England. He's a Certified Information Systems Security Professional. Contact him at [email protected].

Illustration by Peter Bollinger. This article originally appeared in IEEE Security & Privacy, January/February 2013.

PURPOSE: The IEEE Computer Society is the world's largest association of computing professionals and is the leading provider of technical information in the field.

OMBUDSMAN: [email protected]. Next Board Meeting: June 2013, Seattle, WA, USA

EXECUTIVE COMMITTEE
President: David Alan Grier; President-Elect: Dejan S. Milojicic; Past President: John W. Walz; VP, Standards Activities: Charlene ("Chuck") J. Walrad; Secretary: David S. Ebert; Treasurer: Paul K. Joannou; VP, Educational Activities: Jean-Luc Gaudiot; VP, Member & Geographic Activities: Elizabeth L. Burd (2nd VP); VP, Publications: Tom M. Conte (1st VP); VP, Professional Activities: Donald F. Shafer; VP, Technical & Conference Activities: Paul R. Croll; 2013 IEEE Director & Delegate Division VIII: Roger U. Fujii; 2013 IEEE Director & Delegate Division V: James W. Moore; 2013 IEEE Director-Elect & Delegate Division V: Susan K. (Kathy) Land

BOARD OF GOVERNORS
Term Expiring 2013: Pierre Bourque, Dennis J. Frailey, Atsuhiro Goto, André Ivanov, Dejan S. Milojicic, Paolo Montuschi, Jane Chu Prey, Charlene ("Chuck") J. Walrad
Term Expiring 2014: Jose Ignacio Castillo Velazquez, David S. Ebert, Hakan Erdogmus, Gargi Keeni, Fabrizio Lombardi, Hironori Kasahara, Arnold N. Pears
Term Expiring 2015: Ann DeMarle, Cecilia Metra, Nita Patel, Diomidis Spinellis, Phillip Laplante, Jean-Luc Gaudiot, Stefano Zanero

EXECUTIVE STAFF
Executive Director: Angela R. Burgess; Associate Executive Director & Director, Governance: Anne Marie Kelly; Director, Finance & Accounting: John Miller; Director, Information Technology & Services: Ray Kahn; Director, Membership Development: Violet S. Doan; Director, Products & Services: Evan Butterfield; Director, Sales & Marketing: Chris Jensen

COMPUTER SOCIETY OFFICES
Washington, D.C.: 2001 L St., Ste. 700, Washington, D.C.; [email protected]
Los Alamitos: Los Vaqueros Circle, Los Alamitos, CA; [email protected]
Membership & Publication Orders: [email protected]
Asia/Pacific: Watanabe Building, Minami-Aoyama, Minato-ku, Tokyo, Japan; [email protected]

IEEE BOARD OF DIRECTORS
President: Peter W. Staecker; President-Elect: Roberto de Marca; Past President: Gordon W. Day; Secretary: Marko Delimar; Treasurer: John T. Barr; Director & President, IEEE-USA: Marc T. Apter; Director & President, Standards Association: Karen Bartleson; Director & VP, Educational Activities: Michael R. Lightner; Director & VP, Membership and Geographic Activities: Ralph M. Ford; Director & VP, Publication Services and Products: Gianluca Setti; Director & VP, Technical Activities: Robert E. Hebner; Director & Delegate Division V: James W. Moore; Director & Delegate Division VIII: Roger U. Fujii

revised 22 Jan.
IEEE CloudCom 2013
IEEE International Conference on Cloud Computing Technology and Science
2-5 December 2013, Bristol, United Kingdom

The cloud is a natural evolution of distributed computing and of the widespread adoption of virtualization and service-oriented architecture (SOA). In cloud computing, IT-related capabilities and resources are provided as services, via the Internet and on demand, accessible without requiring detailed knowledge of the underlying technology. The IEEE International Conference and Workshops on Cloud Computing Technology and Science, steered by the Cloud Computing Association, aims to bring together researchers who work on cloud computing and related technologies. Register today!
The Community for Technology Leaders
Focused on Your Future

Now when you join or renew your IEEE Computer Society membership, you can choose a membership package focused specifically on advancing your career. In addition to receiving your monthly issues of Computer magazine, hundreds of online courses and books, and savings on publications and conferences, each package includes never-before-offered benefits:

- Software and Systems: includes IEEE Software Digital Edition
- Information and Communication Technologies (ICT): includes IT Professional Digital Edition
- Security and Privacy: includes IEEE Security & Privacy Digital Edition
- Computer Engineering: includes IEEE Micro Digital Edition

Each package offers:

- A digital edition of the most-requested leading publication specific to your interest
- A monthly digital newsletter developed exclusively for your focus area
- Your choice of three FREE webinars from the extensive IEEE Computer Society collection
- Downloads of 12 free articles of your choice from the IEEE Computer Society Digital Library (CSDL)
- Discounts on training courses specific to your focus area

Join or renew today at
