Whitepaper

Performance Testing Meets the Cloud – Opportunities and Challenges

Author: Victor Czenter, Senior Consultant Technical Quality
SQS Software Quality Systems AG, Germany
Published: August 2013

SQS – the world's leading specialist in software quality (sqs.com)
Victor Czenter, Senior Consultant Technical Quality

Victor Czenter is a computer science graduate who, after completing his studies at Munich's Ludwig-Maximilians University, started his career in quality assurance, with the emphasis on performance engineering and test automation, as the technical designer of an innovative test management tool. Since 2010 he has consulted SQS's customers in the spheres of test design and management, load and performance test management, and test automation; he has also organised and conducted training sessions. Within SQS's Excellence Research department he investigates especially viable approaches and trends and standardises them for their introduction into the Group's service portfolio.
Contents

1. Management Summary
2. Introduction
3. Cloud Computing
   Performance
   Performance Testing
   Market – Current Status and Outlook
4. Cloud Testing
   4.1 Cloud as Test Item
   4.2 Cloud as Test Tool
   4.3 Cloud as Test Environment
   4.4 Cloud as Test Logistics
5. Conclusion and Outlook
Bibliographical References
1. Management Summary

Over the last few years, the trend of Cloud Testing has evolved from hype through a trough of disillusionment into a seriously adopted IT topic which has not yet come to full maturity. Many companies offer cloud-specific or cloud-based solutions, not only for enterprises but also for private users. These solutions are available as full-blown infrastructures, complete platforms or ready-to-use software services. The promoted advantages of these cloud types generally promise sufficient capacity, high availability and reliability, as well as high overall performance even under extreme load. Since the existence of the cloud does not affect the user's expectation of IT quality, assuring and testing for quality is still an extremely important topic. The quality attributes to be tested are the same as those used for typical web and onsite applications; however, the weightings for these attributes have slightly changed. Some have become more important (e.g. security), some have remained the same (e.g. functionality), and some, such as performance, seem to have become less important, as the cloud is by definition scalable. In contrast to this widespread view, the purpose of the present paper is to demonstrate that Performance Testing is as important as ever: even if performance has been defined and promised via service level agreements, it has to be tested in order to substitute hard facts for assumptions. Performance Testing is crucial for cloud services and, vice versa, testing can benefit from the cloud: cloud services are perfectly suited to deliver scalable test tool environments, which are especially necessary for the different types of Performance Testing. And these advantages can be leveraged not only for Performance Testing of cloud services. In summary, it is shown that the relevance of Performance Testing has not changed due to the existence of cloud services.
Moreover, taking advantage of cloud services for Cloud Testing makes Performance Testing much easier and more efficient. It is a classic win-win situation.
2. Introduction

Some years ago, Cloud Computing was heavily propagated by opinion leaders on all communication channels as the big IT revolution. Today, the cloud has become reality and established a new type of service delivery. Increasingly, traditional off-the-shelf tool providers are changing their licensing model from selling node-locked licences to scalable pay-per-use models without any need for onsite installation or maintenance (e.g. Microsoft Office 365). And the cloud not only affects business but all our daily private work as well: social networks, providers and big data exchange platforms are all used as cloud services, available on many different platforms (e.g. mobile phones, tablet computers, desktops).

Clouds are by definition highly attractive IT systems. You only pay while you actually use them, and you only need an Internet connection to do so. Besides, they are perfectly scalable, so if many users are using the service, the cloud automatically utilises a greater number of computers to ensure high performance. This promise of scalability is usually underpinned by corresponding key performance indicators ensuring acceptable time frames for each request. A naive user of a cloud service will rely on these promises and consider reducing Performance Testing because there is nothing left to test. Furthermore, the available testing budget on the provider as well as the user side will refocus on quality attributes that seem more important in the cloud context, the number one concern being security testing: since data from different companies might share the same hardware, the data could interfere with each other, giving rise to security issues. Unfortunately, reality differs: every week, the news reports downtimes, low availability, and unacceptable response times. Regardless of the cloud's promise, performance is still a valid and important quality attribute and as such requires testing.
Moreover, Performance Testing itself can benefit from the cloud. In contrast to the testing of quality attributes like functionality or security, Performance Testing has always been strongly dependent on sophisticated tools. It needs to generate high volumes of test data to create a substantial, realistic workload; just testing the time behaviour for a single individual cannot be sufficiently conclusive. Hence, Performance Testing usually requires a complex and expensive test infrastructure. But here the cloud comes into play again: it is perfectly suited to provide this type of infrastructure. Rather than purchasing and installing hardware and software onsite, the performance tester can simply buy the services from the cloud. So even if the concepts behind Performance Testing are independent of the location of the system under test (onsite vs. cloud), the process itself can benefit considerably from the cloud.

The present paper explains the two different aspects of how the cloud affects Performance Testing. On the one hand, it deals with Performance Testing of the cloud as the system under test; on the other, with Performance Testing making use of the cloud as a provider of essential test infrastructure. Interestingly enough, both aspects can easily be combined, the motto being: use the cloud to test the cloud.
3. Cloud Computing

According to Gartner, the cloud is defined as "a style of computing in which scalable and elastic IT-enabled capabilities are delivered as a service to external customers using Internet technologies."

Definition 1: Cloud Computing (Gartner, 2009)

This definition refers to the following characteristics of a cloud: scalability, a large amount of resources, and the offering of end-user services over the Internet. More specifically, the characteristics can be explained like this:

Service-oriented: The focus is on the "what" instead of the "how". The cloud delivers a service, i.e. a value add, to the customer, who should have a black-box view of it.

Elastic: Payment for cloud services is output-based. The more you get, the more you pay, i.e. the cloud has to implement a pay-per-use concept.

Scalable: Strongly related to the black-box view of the cloud is its capability to scale. If you need more service throughput, just order it. There is no need to care about internals or administration; the cloud is scalable by definition.

Internet-connected: In order to use the cloud, there is one (and only one) technical prerequisite: you must have an Internet connection. No proprietary connections, no heavy preinstallations; the service is simply delivered via the Internet.

Computer-based: The service operation itself, which is delivered through the cloud, must depend on IT systems with very little human interaction. These IT systems might include other cloud services.

Regarding the type of service the cloud provides, the following types can be identified:

Infrastructure as a Service (IaaS): From the cloud provider's view, this approach is the easiest and concerns the delivery of processing and network infrastructure. Here, the focus is on physical or virtual machines, firewalls, load balancers, and network infrastructure. All these deliverables are located in a computer centre.
The provider must ensure that the functionality as well as the configuration of the IP and network infrastructure is working. Depending on the contract, the user is able to install the operating system, software or patches on the delivered infrastructure. If additional processing capacity is required, it can easily be provided. Examples of IaaS clouds are Amazon CloudFormation, Google Compute Engine, and RightScale.

Platform as a Service (PaaS): The providers of PaaS deliver a working platform which typically contains operating systems, development environments, databases, and web servers. Application developers can develop and execute their applications on these platforms without buying, maintaining, or licensing the platform. Examples are Amazon Elastic Beanstalk, Heroku, EngineYard, Google App Engine and Microsoft Azure.
Software as a Service (SaaS): The SaaS model provides executable applications in the cloud which are directly usable by end users in the role of a client asking for a complete service. At this point, the cloud shows one of its typical characteristics: elasticity. The solution, taking place in the background, manifests itself in virtualisation: users are distributed to additional virtual machines and receive constant response times, irrespective of the number of concurrent users. Payment is graded and made monthly, annually or per use. Well-known services are Google Apps, Quickbooks Online, and Salesforce.com.

These three types of cloud are not necessarily independent but can be modelled in the layered architecture visualised in Figure 1. Typically, clients access SaaS clouds via web browsers, mobile apps, thin clients, etc. The service itself may use a specific platform based on a particular infrastructure.

Figure 1: Layered architecture of cloud types
– Cloud clients: web browsers, mobile apps, thin clients, etc.
– Application (SaaS): application platforms, CRM, servers, office applications, virtual desktops, games, etc.
– Platform (PaaS): web servers, application servers, databases, development platforms
– Infrastructure (IaaS): virtualisation, servers, storage, NAS, load balancers, network infrastructures

Performance

One of the most important quality attributes with respect to the business efficiency and user acceptance of software systems is performance. Especially in terms of modern cloud-based IT solutions and application systems, the response times as well as the utilisation rate of the resources required are crucial from the point of view of the end user as well as the provider.
ISO (ISO, 2010) defines performance as one of eight product quality attributes. The performance of a system is determined by the amount of resources used under given conditions. These resources can include other software products, the soft- and hardware configuration of the system, and materials (e.g. printing paper, storage media). The different subcharacteristics of performance (which determine performance efficiency) are defined as follows:

Time behaviour: The degree to which the response and processing times and the throughput rates of a product or system, when performing its functions, meet the requirements.

Resource utilisation: The degree to which the amounts and types of resources used by a product or system, when performing its functions, meet the requirements.

Capacity: The degree to which the maximum limits of a product or system parameter meet the requirements. Parameters may include the number of items that can be stored, the number of concurrent users, the communication bandwidth, the throughput of transactions, and the size of the database.

In addition to the product quality attributes, efficiency is also listed as an attribute of quality in use, analysing resources expended in relation to the accuracy and completeness with which users achieve their goals (here human resources are included; for certain types of Performance Testing they have to be simulated). Performance and load tests are typically executed as black-box tests (i.e. no internals are considered) and are supposed to check whether components or respective application systems meet the planned performance requirements under load conditions.
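To make these subcharacteristics concrete, they can be checked mechanically against stated requirements. The following Python sketch is illustrative only: the nearest-rank percentile method, the thresholds and the sample values are invented for this example and are not taken from ISO 25010 or from any real service level agreement.

```python
"""Checking performance subcharacteristics against requirements.

Illustrative sketch: the thresholds and sample values are invented.
"""
import math

def percentile(samples, p):
    """Nearest-rank percentile: smallest value covering p % of samples."""
    ordered = sorted(samples)
    rank = math.ceil(p / 100 * len(ordered))  # 1-based rank
    return ordered[max(0, min(len(ordered) - 1, rank - 1))]

def check_time_behaviour(response_times_s, p95_limit_s):
    """Time behaviour: the 95th-percentile response time must meet the limit."""
    return percentile(response_times_s, 95) <= p95_limit_s

def check_capacity(concurrent_users, max_users_limit):
    """Capacity: the number of concurrent users must stay within the limit."""
    return concurrent_users <= max_users_limit

# Invented measurements from a hypothetical test run (seconds).
times = [0.2, 0.25, 0.3, 0.3, 0.35, 0.4, 0.45, 0.5, 0.6, 1.1]
print("time behaviour ok:", check_time_behaviour(times, p95_limit_s=1.0))
print("capacity ok:", check_capacity(concurrent_users=180, max_users_limit=200))
```

With these invented samples, the single 1.1 s outlier pushes the 95th percentile over the 1.0 s limit, so the time-behaviour check fails while the capacity check passes; this is exactly the kind of requirement-versus-measurement comparison that performance tests automate.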
Furthermore, they help identify performance issues.

Performance Testing

Due to various technical factors and considerable infrastructure requirements, performance tests checking the performance efficiency (see above) of a system are cost- and time-intensive procedures. In these tests, realistic software system situations are simulated, and the response times and the utilisation rate of resources are measured. Aiming for the best possible results, performance and load tests have to be executed in a production-like environment. This makes Performance Testing a very complex and cost-intensive undertaking.

For the preparation and execution of the tests, the targets and boundaries of the measurement values, as well as the types of issues to be identified, need to be defined. In this context, an analysis of business-critical use cases as well as a prioritisation of the most frequently used components is recommended. Furthermore, questions regarding load profiles as well as numerical data and data volumes have to be answered. Last but not least, the hardware resources which ought to be available, as well as the necessary metering points, must be defined.

Performance tests are executed using specialised tools, e.g. load generators or monitoring tools. During the test, not only the user's view of response times but also the system's view of the degree of resource utilisation is recorded and examined. By correlating the measured data, performance bottlenecks and issues can be identified and analysed. Depending on the objective as well as the testing conditions, there are different types of performance tests:
Response Performance Test: This test verifies the duration and responsiveness of an IT system, taking account of the resources used; under load conditions, it verifies the response times of the system.

Batch Performance Test: Batch tests are executed to measure the duration of a batch process, so both time aspects and resource demands are considered. Typical tests verify the duration of migrating a database for legacy systems or relate to end-of-the-day, end-of-the-month or end-of-the-year batch jobs.

Stability Test: This test is executed to verify the stability of a system under load over a longer period of time, with regard to memory leaks and system recovery. Due to programming bugs, memory leak effects may occur which lead to exhaustive memory use, so that the system crashes after some time.

Scalability Load Test: This test verifies whether a system is scalable under increasing load, a behaviour typical of online systems, where the number of concurrent users is unpredictable and may constantly increase.

Stress Test: This test generates a load significantly heavier than usual with the aim of determining the system's limits and its behaviour in the overload range. It simulates unusual user behaviour, e.g. at peak times, and is frequently combined with failover testing.

Failover Test: This test performs the planned failure of a component or part of the system and checks the behaviour during the failure: the load must be handed over to other available components, which take over the job of the failed component(s). Both sides (source and target component) need extra resources to be failover-ready. The test also verifies the configuration of the operational modes of the machines involved.
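As a minimal illustration of how a load generator drives such tests, the following Python sketch ramps the number of virtual users up in steps and records the mean response time per step. Everything here is a stand-in: `fake_request`, the invented `CAPACITY` of 8 concurrent users and `BASE_LATENCY` simulate a system that saturates; a real test would issue requests against the actual system under test instead.

```python
"""Step-load driver sketch for scalability and stress testing.

Illustrative only: fake_request, CAPACITY and BASE_LATENCY are invented
stand-ins for a real system under test and its saturation behaviour.
"""
import time
from concurrent.futures import ThreadPoolExecutor

CAPACITY = 8          # invented saturation point (concurrent users)
BASE_LATENCY = 0.01   # seconds per request below saturation

def fake_request(concurrent_users):
    """Stand-in for one request; latency grows beyond the capacity."""
    overload = max(0, concurrent_users - CAPACITY)
    time.sleep(BASE_LATENCY * (1 + overload))

def run_step(users, requests_per_user=3):
    """Run one load step with `users` virtual users; return mean response time."""
    measured = []

    def virtual_user():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            fake_request(users)
            measured.append(time.perf_counter() - start)

    with ThreadPoolExecutor(max_workers=users) as pool:
        for _ in range(users):
            pool.submit(virtual_user)
    # leaving the 'with' block waits for all virtual users to finish
    return sum(measured) / len(measured)

if __name__ == "__main__":
    # Ramp the load in steps, as a scalability or stress test would:
    # response times stay flat up to CAPACITY and degrade beyond it.
    for users in (2, 4, 8, 16):
        print(f"{users:>2} users: mean response {run_step(users):.3f} s")
```

A stress test corresponds to continuing the ramp well past the saturation point; a stability test corresponds to holding one step for a long time while watching resource consumption.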
Market – Current Status and Outlook

Even if Cloud Computing is seen as just another kind of outsourcing, the demand for cloud-specific and cloud-based services is enormous. This is due to the attractive cost models offered by service providers. Depending on the industry and the core business, it is often better not to buy and subsequently maintain the infrastructure oneself but to make use of cloud services to save administrative costs. Thus, using the cloud may be regarded as a specific type of outsourcing. A recent survey showed that the trend towards outsourcing is getting stronger all the time (see Figure 2). Targets of outsourcing will increasingly be located within the cloud; recent research confirms the continuous growth of cloud services. The volume estimated for 2016 is about 5 billion (see Figure 3).

Figure 2: Budget spent on outsourcing – outsourcing as percentage of total IT spending, median among organisations that outsource (Leclerque, 2010)

Figure 3: Investment and capital expenditure for Cloud Computing – cloud technology, cloud integration & consulting, and cloud services (SaaS, PaaS, IaaS) (Velten & Janata, 2012)
4. Cloud Testing

Testing, as an umbrella term for all testing activities such as Performance Testing, is not only important for traditional IT systems but also for the cloud in various respects. As defined by the ISTQB, testing is "the process consisting of all life cycle activities, both static and dynamic, concerned with planning, preparation and evaluation of software products and related work products to determine that they satisfy specified requirements, to demonstrate that they are fit for purpose and to detect defects."

Definition 2: Testing (ISTQB/GTB, 2010)

Using this definition, the different kinds of interaction between Cloud Computing and testing can be mapped onto four types specifying how the former influences the latter. In concrete contexts, these types may overlap if the cloud takes on several roles at once. Distinguished by the way they are utilised for testing, the four types of cloud are the following:

1. Cloud as system under test: The cloud itself or its relevant output is the software product which is to be tested. This means that the system under test is deployed and executed within the cloud.

2. Cloud as testware utility: The cloud, or services which can be used via the cloud, is utilised for testing another IT system.

3. Cloud as test environment: The software product or its relevant output is executed within the cloud for testing purposes. This means that the test system is deployed and executed within the cloud, but subsequent operation will not necessarily include the cloud.

4. Cloud as test logistics: Testing is delivered as a cloud service in the sense of the typical cloud characteristics, i.e. scalable, elastic, etc.

This concept of Cloud Testing is outlined in Figure 4.
The different types and their possibilities are described below in more detail; then, each type is assessed in terms of its relevance for Performance Testing.

4.1 Cloud as Test Item

The cloud is becoming increasingly important as a production platform, and for this reason systems are now often developed directly in the cloud. But the fact that the applications run there does not necessarily mean that they are flawless. Testing applications in the cloud should therefore receive the same attention as testing applications onsite (e.g. within a particular IT infrastructure or an appropriate environment). Testing the cloud is not fundamentally different from testing onsite applications. The general steps (test planning, test analysis, and test design as a prerequisite for test implementation and test execution) apply in both cases (Spillner & Linz, 2010). Differences can only be identified at the next level of granularity. At that level, during test planning and test analysis, the cloud-specific quality fingerprint is of particular importance. This quality fingerprint manifests itself above all in two respects:
Figure 4: The four types of Cloud Testing – the cloud meets testing (1) as system under test, (2) as testware utility, (3) as test environment, and (4) as test logistics

As a matter of principle, a cloud application has to offer similar product features to a traditional application. However, for special quality attributes (e.g. those mentioned in ISO 25010) the focus and weighting must be adjusted. The aspect of security requires far more attention when testing a cloud application as opposed to a normal application. This is due to the fact that a cloud application can be accessed at any time from different computers via an Internet connection. In addition, the hardware providing a specific cloud service is used for many different cloud services, each having various sets of users. Security has to ensure that a specific cloud service can only be accessed by the corresponding set of authorised users. Special attention is also required regarding the quality attributes of performance (since the application owner has hardly any opportunities to influence the operational environment), portability (in order to minimise interdependencies between specific clouds), and recoverability (because restarts and other cloud accidents cannot be influenced directly; see Sawall, 2012).

The infrastructure of a cloud is highly complex, and the weakest link of the chain determines the overall quality. Therefore, testing a cloud always includes testing new items whose quality will decide the general performance: e.g. failover mechanisms, electric power supplies, load balancers, and cache mechanisms. How important these additional test items are becomes clear when one looks at some electricity outages that recently occurred. Here, the business continuity infrastructure obviously had not been tested, and so the cloud was not able to deliver in these specific cases of emergency (Sawall, 2012).
This type of Cloud Testing covers all the Performance Testing activities necessary to check out a system that is delivered via the cloud. In terms of the different layers of the cloud's architecture, all layers can be targeted as systems under test, depending on what is to be delivered as a cloud service.

4.2 Cloud as Test Tool

In order to test an IT system, one needs a variety of so-called testware: "artifacts produced during the test process required to plan, design, and execute tests, such as documentation, scripts, inputs, expected outcomes, set-up and clear-up procedures, files, databases, environment, and any additional software or utilities used in testing."

Definition 3: Testware (ISTQB/GTB, 2010)

Typical examples of such testware on the application or platform layer are test data generators, test management tools, test report generators, test automation tools, etc. In addition, the interfaces to other processes like risk management and incident management have to be served by adequate instruments which have been tested, too. While formerly every test item had to run its own, partly proprietary testware landscape (beginning with installation, updating, and licensing, and ending with preservation, maintenance, and perhaps discontinuation), today a variety of such testware can be obtained via the cloud. The advantages of the cloud are obvious: service orientation (since testware is easily available), cost transparency and optimisation (since costs only arise if the testware tool is actually used), as well as scalability (since the testware is also executable during test peaks).

Testing supported by testware provided via the cloud will pick up speed in the near future. Nevertheless, it has to be taken into account that to date many companies still own and administrate their own testware. A big motivational boost in this context is the increasing number of mobile devices.
Due to their variety, it is almost impossible for companies to ensure the availability of all required mobile devices as well as the associated testware. Via the cloud, however, the complete portfolio of devices and new testware is technically available for test purposes. Special attention in the context of using the cloud as test tool needs to be given to compliance aspects and to the general requirements for data security. If test data allows inferences to be made about production data and the personal data referenced therein, the cloud must meet special requirements, even if the application under test is not executed in the cloud itself as a production system.

This type of Cloud Testing will boost Performance Testing, since the cloud as test tool improves and simplifies the availability of the tools it requires. Moreover, the increased tool availability will have a positive impact on Performance Testing of traditional onsite applications.

4.3 Cloud as Test Environment

IT systems are mainly developed for a specific production environment. Usually, all testing activities are applied on a different platform, defined by the test environment (including the test harness). The ISTQB defines test environment and test harness as follows:
The test environment is "the environment containing hardware, instrumentation, simulators, software tools, and other support elements needed to conduct a test" [after IEEE 610].

Definition 4: Test environment (ISTQB/GTB, 2010)

The test harness is "a test environment comprised of stubs and drivers needed to execute a test."

Definition 5: Test harness (ISTQB/GTB, 2010)

The option of providing the test harness (or parts of it) through the cloud is quite similar to testware delivery via the cloud, as already discussed in Section 4.2 above: if an application needs access to an exchange server, this part of the test harness is already obtainable via the cloud today. From the point of view of cloud architecture, the cloud as test environment fits perfectly into the PaaS layer.

In addition, it can also be useful to outsource the entire application under test into the cloud for a specific test run, regardless of whether the application is afterwards to be run onsite in a production environment. Advantages of this approach are often reduced costs for licensing (i.e. onsite only for the production system and elastic for the cloud), scalable availability (e.g. to absorb test load peaks), and easier generation of variants (e.g. test environments covering different versions of system software).

This type of Cloud Testing will boost Performance Testing, since the cloud as test environment improves and simplifies the availability of test environments. In practice, many Performance Testing activities are postponed either due to a lack of test environments or due to unknown impacts of a test environment on the production system; the cloud as a test environment provides the sandbox a performance tester needs.
However, the results obtained by Performance Testing in this way are more difficult to interpret: even if the performance tester tries to establish maximum similarity between the test environment (the cloud) and the production system (onsite), the cloud will usually behave differently.

4.4 Cloud as Test Logistics

An increasingly innovative, and for the future more relevant, form of testing is Testing as a Service (TaaS), i.e. testing as a cloud service. From a more conceptual point of view, the term Managed Testing Services has so far established itself for this industrialised test factory approach (Computerwoche, 2010). The mapping of Managed Testing Services onto the cloud has been described by Simon & Simon (2012) and follows Gartner's definition of cloud services mentioned above. It basically fits all layers of the cloud architecture, depending on the respective building block of the cloud used. The cloud as test logistics therefore focuses on activities which have a high potential for automation (e.g. regression tests based on test automation); which can be delivered in modules with (partly) dedicated responsibility for each module (the factory approach, where results rather than internal information are of prime importance); and which exhibit an appropriate scalability and elasticity for a specified minimum size (regarding contract duration and demand volume).
Using the cloud as test logistics may be of great advantage not only for the provider but also for the client:

The client can focus on core competencies (IT is normally only an enabler for business and not an end in itself), and the costs for testing are transparent in advance and therefore easy to calculate and budget. Due to scalability effects on the provider's side, testing generally becomes significantly cheaper for the client.

The provider of the cloud as test logistics can specialise fully in the testing business and establish expert knowledge accordingly. It makes sense for the provider to make available highly specialised tooling as well as tool chains, since these can be used repeatedly for different clients. The long-term nature of the cloud as test logistics as well as the scalability effects put cost-intensive resource management (e.g. short-term sales of single resources) and sales expenses into perspective.

This type of Cloud Testing will be the future of Performance Testing in the mid-term. Up to now, most performance tests still contain quite a few manual and ad hoc adjustments and customisations, which runs counter to the factory approach with its focus on standardised and repeatable activities. But as long as an application has to prove its performance over many updates, i.e. it requires a performance regression test, this type of Cloud Testing should be considered. If the fundamental architecture (as captured by the test harness) does not change, it will make sense for the provider to offer any performance regression test as a cloud service. In this context, it should be mentioned that the cloud as test logistics places some specific requirements not only on the provider's but also on the client's side; there, testing needs to be on an industrial scale and should have reached a specific test maturity.
Key parameters like test stages, test items, and depth of testing must have been shaped in an explicitly process-related way before cloud as test logistics is an option. Today, so-called cloud readiness checks can provide adequate transparency. The provider, on the other hand, needs to demonstrate outstanding expertise, particularly to ensure scalability and elasticity. Companies that are newbies in the business of running a service often have to learn this the hard way, with potential consequences for the clients in the form of unexpected non-performance or insufficient output.
5. Conclusion and Outlook

Even though cloud services promise scalability and elasticity, their performance must be tested. The methods and processes of testing the performance of an application within the cloud do not differ from the Performance Testing of traditional onsite applications. However, there are further opportunities that can be realised in the context of Cloud Computing. Considering the cloud from a performance perspective, three additional points of contact can be identified that may have a great impact on Performance Testing, even if the system under test is traditional and located onsite:

Cloud as test tool: Performance Testing always requires a sophisticated tool infrastructure. In the past, this made life difficult for many performance testers because all their tools needed specific hardware and had to be installed, updated, patched, licensed, and backed up. In many cases, performance was not sufficiently tested simply due to missing test tools. In this area, the cloud will boost Performance Testing: forget about buying, installing, maintaining, or updating test tools; just use them via the cloud. Focus on Performance Testing instead of tools.

Cloud as test environment: Performance Testing requires a running application. In the past, this requirement often could not be fulfilled due to a lack of test environments. Either the environment was still under construction, or only a production environment had been planned, or too many testers were queuing up for a single test environment. In this area, the cloud will boost Performance Testing as well: if you need a specific test environment, just use it via the cloud. You need a specific mobile device for Performance Testing? Take it from the cloud. Focus on Performance Testing instead of struggling for test environments.

Cloud as test logistics: From a business perspective, Performance Testing is only a means to an end. The business merely has to ensure that the performance is OK; it is not interested in any testing details. In this area again, the cloud will boost Performance Testing: now that it is possible to offer Performance Testing as a cloud service, the customer can rely on greater transparency and plannability. The business is able to focus on its core competencies once again, while the performance tester can concentrate on the factory approach, taking responsibility for internal details and promising output.

By leveraging these three additional types of Cloud Testing, Performance Testing will be applied more efficiently and more effectively.
Bibliographical References

Computerwoche. (2010). Managed Testing Services: Deutsche Firmen lassen IT von Profis testen.

ISO. (2010). DIN ISO/IEC 25010: Software product Quality Requirements and Evaluation (SQuaRE).

ISTQB/GTB. (2010, September). Standardglossar der Testbegriffe, Deutsch/Englisch. Retrieved from German Testing Board: CT_Glossar_DE_EN_V21.pdf

Leclerque, K. (2010, September). Outsourcing im Jahr … – Anteil der IT-Ausgaben bei 25 … Retrieved June 2012.

Sawall, A. (2012, June). Dreifacher Stromausfall verursachte Auszeit. Retrieved from Golem.

Simon, F., & Simon, D. (2012). IT-Industrialisierung. In Thought Leadership 2012. Köln.

Spillner, A., & Linz, T. (2010). Basiswissen Softwaretest: Aus- und Weiterbildung zum Certified Tester – Foundation Level nach ISTQB-Standard (4th edition). dpunkt Verlag.

Velten, C., & Janata, S. (2012, March). Neue Cloud-Prognosen der Experton Group – Gesamtmarkt wächst weiter stark. Retrieved 17 July 2012, from Experton Group: neue-cloud-prognosen-der-experton-group.html