Recommendations for Performance Benchmarking
Shikhar Puri

Abstract

Performance benchmarking of applications is increasingly becoming essential before deployment. This paper covers recommendations and best practices that can be adopted when benchmarking the performance of applications.

Jan 2008
Introduction

For an application, it is no longer sufficient to test functionality alone. Testing an application to ensure that it satisfies the desired performance parameters is fast becoming the norm. Performance testing and benchmarking of an application can be performed before deploying the product in the production environment or after the deployment. If the business is expected to grow, the organization should ascertain the application's ability to support the extended business volumes on existing infrastructure, and identify potential areas of concern or bottlenecks in terms of performance. It is essential to identify the areas where enhancements, tuning and upgrades are required. Performance testing and benchmarking also provide proof of the performance of the product and set a baseline for further enhancement of the application.

The need to benchmark performance

The objective of performance benchmarking is to eliminate performance bottlenecks prevalent in applications and infrastructure components by identifying and tuning them. This might involve conducting performance tests iteratively on the application against a representative production workload and the data volumes anticipated in production after a significant period of time.

Performance benchmarking methodology

The performance testing lifecycle runs in parallel to the software development lifecycle and starts with the requirement gathering exercise for the software application.
The performance testing methodology consists of the following phases:

Planning phase

The objective of the planning phase is to evolve the test plan and strategy, which involves the following activities:
- Study and review the application architecture
- Understand system functionality and flow
- Review hardware components
- Assess the usability of the performance testing tool
- Define expectations for transaction performance and system behavior
- Define performance metrics for a number of parameters, including server and network utilization
- Define a testing approach
- Define a test validation strategy
- Collect the workload profile:
  - Peak and off-peak time periods
  - Total and concurrent number of users
  - Key transactions, their frequency and complexity
  - Transaction mix with respective user profiles
  - Usage pattern and think time between transactions
  - Data access intensity and usage of data

Test suite development

In this phase the following activities are covered:
- Set up the test environment
- Create and validate test scripts
- Create data population scripts
- Create transaction data pools for use during tests
- Set up the performance monitoring process

Infosys White Paper
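The workload profile collected during the planning phase can be captured in a simple structure and used to allot virtual users to each transaction. This is a minimal sketch; the transaction names, mix percentages and think times below are hypothetical.

```python
# Hypothetical workload profile captured during the planning phase.
# All transaction names, shares, and think times are illustrative.
workload_profile = {
    "peak_concurrent_users": 500,
    "transactions": [
        # (name, share of total mix, mean think time in seconds)
        ("login",          0.10, 5),
        ("search_account", 0.45, 12),
        ("create_order",   0.30, 20),
        ("run_report",     0.15, 30),
    ],
}

# Allot virtual users to each transaction according to the mix.
for name, share, think in workload_profile["transactions"]:
    users = round(workload_profile["peak_concurrent_users"] * share)
    print(f"{name}: {users} virtual users, ~{think}s think time")
```

Keeping the profile in one place like this makes it easy to check that the shares sum to 100% before scripting begins.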
Performance test execution

This phase consists of a number of benchmarking runs, each run with more concurrent virtual users. It continues until the performance testing objectives are met or the system breakpoint is reached. If the system breaks down because of the test environment infrastructure, the test results need to be extrapolated. A typical benchmarking cycle consists of the following steps:
- Executing performance test scenarios and monitoring scripts
- Collecting test and monitoring results
- Validating the results

Analysis and review

Data collected from the performance tests is consolidated and analyzed. The steps include:
- Data review (monitoring logs)
- Performance break-up across servers
- Analysis of potential bottlenecks and scalability limitations
- Identification of system and infrastructure related issues
- Analyzing system performance sensitivity to the workloads

If the timeframe allows for issue resolution, the execution cycle is repeated after the issues have been resolved.

Recommendations

Creating a production profile in the performance testing environment is almost impossible; it can only be simulated. The effectiveness and accuracy of the performance benchmarking exercise depends on how closely the workload simulated in the test environment resembles that of the production environment. This section covers some of the best practices that can be adopted while performing the performance benchmarking exercise.

Transaction scripting

The performance benchmark test suite is developed by scripting selected critical transactions. When executed, these scripts generate a virtual user in the system. This user performs the actions or transactions recorded in the script. Various data sets can be provided to the scripts for use during test execution to ensure the coverage of different data sets in various iterations of the transaction.
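Supplying different data sets per iteration, as described above, can be sketched with a weighted data pool so that test data usage stays proportional to production usage. The query patterns and weights here are hypothetical, and the generator is seeded so the selection is repeatable across runs.

```python
import random

# Hypothetical enquiry data pool: each entry is (query pattern, production share).
# Selecting with these weights keeps test data usage proportional to real usage.
data_pool = [
    ("exact_account_number", 0.60),
    ("wildcard_name_search", 0.25),
    ("date_range_enquiry",   0.15),
]

def pick_enquiry(rng):
    """Pick one enquiry pattern, weighted by its share of production traffic."""
    patterns = [p for p, _ in data_pool]
    weights = [w for _, w in data_pool]
    return rng.choices(patterns, weights=weights, k=1)[0]

rng = random.Random(7)  # seeded so the data selection is repeatable
sample = [pick_enquiry(rng) for _ in range(1000)]
print(sample.count("exact_account_number") / 1000)  # roughly 0.60
```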
To simulate the real-time production scenario, sleep time is introduced in the scripts, which helps execute transactions similar to real time. The sleep time corresponds to think time, data entry time and the time between iterations. A few points are worth considering during transaction scripting for virtual users:

Think time generation: Most tools provide the facility to generate think time (data entry time and the time between iterations) randomly, but this method generally makes the test profile random, and the results cannot be reproduced or repeated for verification. To avoid this scenario, think time can be generated scientifically. It can be calculated in advance to distribute the transaction profiles evenly over the whole test duration, or randomly generated and recorded in the first test. The recorded think time can then be used in future tests to verify the repeatability of the test results.

Transaction distribution: The test results for the steady state phase of the performance test are considered for analysis and evaluation. The results of the ramp-up (initial) and ramp-down (logout) phases are ignored. If the think time generation process is random, a major chunk of users might come into the system and start their transaction iterations at the same time, which will bring about a spike in resource usage and skew the results. Also, infrequently occurring transactions that get executed in the ramp-up and ramp-down phases will be absent from the system throughput considered for analysis and evaluation. Control over scientifically generated think time helps in devising the login behavior and transaction start phase for virtual users. Users start their iterations gradually, achieve a system steady state, and come out of the system once the throughput of all the critical transactions has been met. This portrays a scenario that closely resembles the real-time usage of the application.
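The repeatable think-time generation recommended above can be sketched by seeding the random generator and pre-generating the sequence, so the same pacing can be replayed in later verification runs. This assumes a uniform spread around a mean think time; the figures are illustrative.

```python
import random

def generate_think_times(seed, iterations, mean, spread):
    """Pre-generate a repeatable think-time sequence for one virtual user.

    Seeding the generator makes the 'random' profile reproducible, so a
    later test can replay exactly the same pacing for verification.
    """
    rng = random.Random(seed)
    return [rng.uniform(mean - spread, mean + spread) for _ in range(iterations)]

# The same seed yields the same sequence in every run.
run1 = generate_think_times(seed=42, iterations=5, mean=10.0, spread=3.0)
run2 = generate_think_times(seed=42, iterations=5, mean=10.0, spread=3.0)
assert run1 == run2
```

In practice the first run's generated values would be recorded alongside the results, and subsequent runs would replay them from the record rather than regenerate.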
User session management: Scripts should be created in such a manner that all virtual users are injected gradually during the initial ramp-up phase and remain in the system until the test is over, even if their assigned iterations have been completed. This can be achieved by setting a large value for the think time towards the end of the virtual user scripts, and helps maintain the ratio of active to inactive sessions in the system. Testing scenarios depend on the performance test requirements and the real-time transaction usage pattern; the script design should be according to the scenario being tested.

Data pool selection: Data pool sets for transactions should be carefully selected. The transaction data usage should be in proportion to the real-time usage of the data. If the real-time system is supposed to perform an enquiry, the distribution of data in terms of different query parameters, data creation dates, range of data selected (for example, wild card enquiries or date ranges), data access intensity (rows to be returned for the query), usage of lists of values, and the repetition intensity of the same or similar data in further transaction iterations should also be considered, to simulate different types of enquiries and ensure wider coverage of transaction scenarios. These parameters produce a workload profile and test results close to the real-time scenario.

System configuration and setup

The configuration and setup of the test environment should be similar to that of the production environment. A few factors that need to be considered on this front are:

Load balancing scenario: Load balancing in the test environment should be the same as in the production environment. A scenario where users are distributed on a round-robin basis might give rise to issues in the test environment. Users in real time execute multiple transactions in a random fashion; the transaction usage pattern and behavior do not remain constant over a period of time.
In the simulation exercise, a transaction or a group of transactions is recorded in a virtual script, and that script is executed with multiple users. A virtual user repeats the same activity in multiple iterations with different data sets. If virtual users or scripts with heavy transactions connect to any one of the servers, they will repeatedly send requests to the same server. Though the number of users will be balanced across multiple load-balanced servers, the load will still be imbalanced between multiple requests.

Distribution of loaded data: The test environment should be pre-populated to a level comparable to around six months or one year of production data. The time duration for which data needs to be loaded depends on the data archive and purging requirements of the system and the expected system longevity. The pre-populated data should have a distribution similar to that in the production environment. Various business parameters and data distribution parameters need to be collected in advance, or production profiling should be done, to generate the data load scripts. If production data is available, the ideal scenario is to replicate or clone the production data in the test environment. This helps in generating transaction response times and elapsed times similar to those of the production environment.

Placement of data on disks: If data population scripts are used for loading data into the test environment and the data is loaded sequentially into the tables, data related to specific tables will be populated on the same disk. While performing transactions on these tables or related tables, either the data is accessed quickly (due to the operating system caching mechanism) or the disk becomes a bottleneck (due to excessive I/Os on specific disks). This might not be the case in the real-time scenario, as multiple transactions are performed during a day and the data for a specific transaction gets created gradually on a daily basis.
The data is distributed automatically across various disks and the probability of the disk being a bottleneck is reduced. The placement of data on disks should be as close to production as possible. The ideal scenario is to clone the production data in the test environment, keeping the disk structure similar to that in the production environment.
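The pre-population described above can spread record creation dates across the required history window, so the loaded data resembles data that accumulated gradually in production rather than a single bulk load. The record volume, history window and seed below are illustrative.

```python
import datetime
import random

def spread_creation_dates(num_records, months_of_history, seed=11):
    """Assign each pre-loaded record a creation date spread across the
    history window, mimicking data that accumulated gradually in production."""
    rng = random.Random(seed)
    today = datetime.date.today()
    window_days = months_of_history * 30
    return [today - datetime.timedelta(days=rng.randint(0, window_days))
            for _ in range(num_records)]

# Example: 100,000 records spread over roughly 12 months of history.
dates = spread_creation_dates(100_000, 12)
print(min(dates), max(dates))
```

A real data load script would feed these dates into the population scripts alongside the business-parameter distributions collected from production profiling.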
Load injector machine setup

Virtual user scripts contain a single transaction or a group of similar transactions which are generally executed sequentially. These scripts are configured to be executed on load injector machines with a designated number of iterations as per the system workload profile. Factors that need to be considered while configuring the load injector machines and virtual scripts:

Data caching: The virtual user script for a single transaction, with its assigned number of users, is configured on a single load injector machine. The data pool or pre-requisite data for the multiple iterations of the transaction can be different (as configured in the data pool), but the requests from all the assigned users for the candidate script are generated from a single load injector box. If the load injector box caches transaction-specific static data, the response time of further iterations of the candidate transaction will be reduced and will not portray the real-time scenario. To avoid issues with data caching, the machine should be set up properly or the virtual user scripts and transactions should be distributed appropriately.

Distribution of IP addresses on the load injector: Multiple transaction scripts and virtual users configured on a load injector machine use the same IP address for execution. This means that all virtual users will be using the same IP subnet mask, whereas in a real-time scenario only a few of the users will be using the same subnet mask. The test results in this scenario might differ from the real-time results. Tools are available which can assign multiple IP addresses to different virtual users on the same load injector box; this helps generate a better picture of the real-time scenario.

Bandwidth simulation: The performance benchmarking exercise is normally performed on the server side.
The load controller and injector machines are connected to the servers over high-speed LANs, so network congestion and delays in transactions are not taken into account. Tools are available to define the usage of network bandwidth by virtual users on load injector machines. If the system is locally deployed or de-centralized, the benchmark tests should be performed with the real-time network bandwidth. And if the system is globally rolled out, with multiple sites injecting load on the system at the same time, the transaction load from the various sites should preferably be allocated to different load injector machines. These site-specific machines should be simulated with the network bandwidth available at the respective sites. The pre-requisite for this scenario is the availability of the network topology of the various production sites. This will give an accurate picture of the network congestion and delays for the system.

Memory requirements of virtual users: All the users assigned to a load injector box require memory to execute a transaction. The maximum memory consumption of a single virtual user should be calculated in advance, and users distributed accordingly, to avoid congestion on the client machines. Not more than 70-80% of the available memory on a load injector machine should be consumed.

Execution and analysis

Factors that need to be considered for the execution and analysis of test results:

Validation of transaction response: Usually the data used during tests is different from the data used while recording or creating a transaction script, which means the response at various steps in a transaction script can differ from what was expected at recording or creation time. To overcome this issue, the virtual user scripts should be devised in such a way that the response to each request is validated for success before further progress. The next step in a transaction should be executed depending on the response received from the servers.
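The response validation described above can be sketched as a guard inside the virtual user script: each step checks the previous response for a success marker before proceeding. The check function and the marker text are hypothetical; real load testing tools expose equivalent text or status checks on each response.

```python
# Sketch of response validation inside a virtual user script. The check
# function and the success marker are hypothetical; real tools provide
# equivalent text/status checks on each response.
class TransactionFailed(Exception):
    pass

def validated_step(response_text, success_marker):
    """Proceed to the next step only if the response confirms success."""
    if success_marker not in response_text:
        raise TransactionFailed(f"expected '{success_marker}' in response")
    return True

# The next step runs only after the previous response is validated.
validated_step("Order 1041 created successfully", "created successfully")
try:
    validated_step("Internal Server Error", "created successfully")
except TransactionFailed:
    print("transaction aborted; not counted toward throughput")
```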
Profiling of transaction steps: Transactions can be further divided into multiple logical steps, and these steps can also be monitored for response or elapsed time. If the transaction response or elapsed time is above the expected response time, these logical steps help provide a better picture for understanding bottlenecks in the transaction. The problematic steps can then be examined further to remove performance issues.

Validating execution of transactions: After the benchmark test, the simulation tool provides a summary of the test results, including transaction response times, transaction throughput, etc. The throughput mentioned in the test results summary is based on the execution of the steps mentioned in the virtual script. There could be a scenario where the transaction fails but the virtual user step gets executed; this will still show up in the test results summary. The best way of verifying the transaction throughput is to verify the occurrence of the transaction from the database or the affected areas. This is easy if the transaction performs create, update or delete operations. For read operations, the opening of the screen can be logged somewhere so that the transaction occurrence or success can be verified.
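Verifying throughput against the database, rather than trusting the tool's step-based summary, can be sketched with a simple count over the affected table for the test window. The table and column names are hypothetical; an in-memory SQLite database stands in for the application database.

```python
import sqlite3

# Hypothetical 'orders' table standing in for the application database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, created_at TEXT)")

# Suppose the tool's summary claims 3 successful create-order transactions...
for ts in ("10:01", "10:05", "10:09"):
    conn.execute("INSERT INTO orders (created_at) VALUES (?)", (ts,))
conn.commit()

# ...verify the count from the affected table for the test window.
actual = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE created_at BETWEEN '10:00' AND '10:10'"
).fetchone()[0]
print(actual)  # 3
```

A mismatch between this count and the tool's reported throughput indicates transactions that failed despite their script steps executing.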
Transaction response time: Generally, transaction response times are presented as the 90th percentile transaction response time. This means that 90% of the response time occurrences for the candidate transaction are below the figure marked as the 90th percentile. This is better than presenting the average response time, which averages out the peaks and variations in response times. Sometimes even the 90th percentile figure does not portray the correct picture, and further analysis should be carried out before presenting the results summary. For example, if a transaction has very few iterations (say, 2 per test duration) and one of the iterations happened during the initial stage or while the system was not performing well, the 90th percentile response time will be the response time of that iteration, and it will be very high. In this scenario, we can ignore that iteration, with proper justification, and consider the response time of the 2nd iteration. Similarly, if the frequency of a transaction is high and the 90th percentile response time is well above the expected response time, the difference between the 85th percentile and 90th percentile response times should also be checked. If the difference is small, the transaction has performance issues; but if the difference is very large, the iteration occurrences should be checked and the reasons examined. It is often observed that the transaction itself is performing well, but there are issues elsewhere in the system at the times the iterations of the candidate transaction are performed.

Conclusion

Performance benchmarking, pre-deployment, is an increasingly prevalent practice. It is an essential step in ensuring that non-functional requirements are satisfied. However, performance benchmarking isn't a trivial exercise; it requires extensive and detailed planning. A number of best practices have been developed over the years.
Following these best practices ensures that you get the best results from the exercise. When properly executed, performance benchmarking eliminates rework post-deployment, thus reducing the cost of an application.

About the Author

Shikhar Puri is a Senior Technical Architect with Infosys. He has 8 years of IT experience. His areas of interest are predominantly in the performance engineering domain, concentrating on strategies and solutions for defining and satisfying non-functional requirements, workload modeling, system sizing, performance tuning and testing, and simulation of enterprise systems.
What Is Specific in Load Testing?
What Is Specific in Load Testing? Testing of multi-user applications under realistic and stress loads is really the only way to ensure appropriate performance and reliability in production. Load testing
MEASURING WORKLOAD PERFORMANCE IS THE INFRASTRUCTURE A PROBLEM?
MEASURING WORKLOAD PERFORMANCE IS THE INFRASTRUCTURE A PROBLEM? Ashutosh Shinde Performance Architect [email protected] Validating if the workload generated by the load generating tools is applied
Managing and Maintaining Windows Server 2008 Servers
Managing and Maintaining Windows Server 2008 Servers Course Number: 6430A Length: 5 Day(s) Certification Exam There are no exams associated with this course. Course Overview This five day instructor led
Performance characterization report for Microsoft Hyper-V R2 on HP StorageWorks P4500 SAN storage
Performance characterization report for Microsoft Hyper-V R2 on HP StorageWorks P4500 SAN storage Technical white paper Table of contents Executive summary... 2 Introduction... 2 Test methodology... 3
Performance Testing Uncovered
Performance Testing Uncovered First Presented at: NobleStar Systems Corp. London, UK 26 Sept. 2003 Scott Barber Chief Technology Officer PerfTestPlus, Inc. Performance Testing Uncovered Page 1 Performance
Fixed Price Website Load Testing
Fixed Price Website Load Testing Can your website handle the load? Don t be the last one to know. For as low as $4,500, and in many cases within one week, we can remotely load test your website and report
Performance Testing. Nov 2011
Performance Testing Nov 2011 Load testing tools have been around for more than a decade but performance testing practices adopted by the IT industry are still far from maturity. There are still severe
White Paper. Recording Server Virtualization
White Paper Recording Server Virtualization Prepared by: Mike Sherwood, Senior Solutions Engineer Milestone Systems 23 March 2011 Table of Contents Introduction... 3 Target audience and white paper purpose...
Testing Big data is one of the biggest
Infosys Labs Briefings VOL 11 NO 1 2013 Big Data: Testing Approach to Overcome Quality Challenges By Mahesh Gudipati, Shanthi Rao, Naju D. Mohan and Naveen Kumar Gajja Validate data quality by employing
Comprehending the Tradeoffs between Deploying Oracle Database on RAID 5 and RAID 10 Storage Configurations. Database Solutions Engineering
Comprehending the Tradeoffs between Deploying Oracle Database on RAID 5 and RAID 10 Storage Configurations A Dell Technical White Paper Database Solutions Engineering By Sudhansu Sekhar and Raghunatha
Accelerate Testing Cycles With Collaborative Performance Testing
Accelerate Testing Cycles With Collaborative Performance Testing Sachin Dhamdhere 2005 Empirix, Inc. Agenda Introduction Tools Don t Collaborate Typical vs. Collaborative Test Execution Some Collaborative
Databases Going Virtual? Identifying the Best Database Servers for Virtualization
Identifying the Best Database Servers for Virtualization By Confio Software Confio Software 4772 Walnut Street, Suite 100 Boulder, CO 80301 www.confio.com Many companies are turning to virtualization in
Database Server Configuration Best Practices for Aras Innovator 10
Database Server Configuration Best Practices for Aras Innovator 10 Aras Innovator 10 Running on SQL Server 2012 Enterprise Edition Contents Executive Summary... 1 Introduction... 2 Overview... 2 Aras Innovator
How To Model A System
Web Applications Engineering: Performance Analysis: Operational Laws Service Oriented Computing Group, CSE, UNSW Week 11 Material in these Lecture Notes is derived from: Performance by Design: Computer
Optimization of Cluster Web Server Scheduling from Site Access Statistics
Optimization of Cluster Web Server Scheduling from Site Access Statistics Nartpong Ampornaramveth, Surasak Sanguanpong Faculty of Computer Engineering, Kasetsart University, Bangkhen Bangkok, Thailand
SUNYIT. Reaction Paper 2. Measuring the performance of VoIP over Wireless LAN
SUNYIT Reaction Paper 2 Measuring the performance of VoIP over Wireless LAN SUBMITTED BY : SANJEEVAKUMAR 10/3/2013 Summary of the Paper The paper s main goal is to compare performance of VoIP in both LAN
Fundamentals of a Windows Server Infrastructure Course 10967A; 5 Days, Instructor-led
Lincoln Land Community College Capital City Training Center 130 West Mason Springfield, IL 62702 217-782-7436 www.llcc.edu/cctc Fundamentals of a Windows Server Infrastructure Course 10967A; 5 Days, Instructor-led
Performance Testing. on Production System
Performance Testing on Production System Abstract Performance testing is conducted to check whether the target application will be able to meet the real users expectations in the production environment
Global Server Load Balancing
White Paper Overview Many enterprises attempt to scale Web and network capacity by deploying additional servers and increased infrastructure at a single location, but centralized architectures are subject
STeP-IN SUMMIT 2013. June 18 21, 2013 at Bangalore, INDIA. Enhancing Performance Test Strategy for Mobile Applications
STeP-IN SUMMIT 2013 10 th International Conference on Software Testing June 18 21, 2013 at Bangalore, INDIA Enhancing Performance Test Strategy for Mobile Applications by Nikita Kakaraddi, Technical Lead,
Microsoft SQL Server: MS-10980 Performance Tuning and Optimization Digital
coursemonster.com/us Microsoft SQL Server: MS-10980 Performance Tuning and Optimization Digital View training dates» Overview This course is designed to give the right amount of Internals knowledge and
Aras Innovator 10 Scalability Benchmark Methodology and Performance Results
Aras Innovator 10 Scalability Benchmark Methodology and Performance Results Aras Innovator 10 Running on SQL Server 2012 Enterprise Edition Contents Executive Summary... 1 Introduction... 2 About Aras...
VirtualCenter Database Performance for Microsoft SQL Server 2005 VirtualCenter 2.5
Performance Study VirtualCenter Database Performance for Microsoft SQL Server 2005 VirtualCenter 2.5 VMware VirtualCenter uses a database to store metadata on the state of a VMware Infrastructure environment.
Keys to Optimizing Your Backup Environment: Tivoli Storage Manager
Keys to Optimizing Your Backup Environment: Tivoli Storage Manager John Merryman GlassHouse Technologies, Inc. Introduction Audience Profile Storage Management Interdependence Backup Pain Points Architecture
Application. Performance Testing
Application Performance Testing www.mohandespishegan.com شرکت مهندش پیشگان آزمون افسار یاش Performance Testing March 2015 1 TOC Software performance engineering Performance testing terminology Performance
Tools for Testing Software Architectures. Learning Objectives. Context
Tools for Testing Software Architectures Wolfgang Emmerich Professor of Distributed Computing University College London http://sse.cs.ucl.ac.uk Learning Objectives To discuss tools to validate software
IBM RATIONAL PERFORMANCE TESTER
IBM RATIONAL PERFORMANCE TESTER Today, a major portion of newly developed enterprise applications is based on Internet connectivity of a geographically distributed work force that all need on-line access
Datasheet iscsi Protocol
Protocol with DCB PROTOCOL PACKAGE Industry s premiere validation system for SAN technologies Overview Load DynamiX offers SCSI over TCP/IP transport () support to its existing powerful suite of file,
ENOVIA V6 Architecture Performance Capability Scalability
ENOVIA V6 Technical Advantages Whitepaper ENOVIA V6 Architecture Performance Capability Scalability a Product Lifecycle Management Whitepaper Prepared by ENOVIA, a Dassault Systèmes Brand Executive Summary
An Oracle White Paper July 2011. Oracle Primavera Contract Management, Business Intelligence Publisher Edition-Sizing Guide
Oracle Primavera Contract Management, Business Intelligence Publisher Edition-Sizing Guide An Oracle White Paper July 2011 1 Disclaimer The following is intended to outline our general product direction.
APPLICATION PERFORMANCE TESTING IN A VIRTUAL ENVIRONMENT
APPLICATION PERFORMANCE TESTING IN A VIRTUAL ENVIRONMENT A REPORT FROM SAP CO-INNOVATION LAB : Joerg Nalik Shunra Software: Dave Berg HP Software: Mark Tomlinson 1.0 Table of Contents 1 Introduction Raising
There are a number of factors that increase the risk of performance problems in complex computer and software systems, such as e-commerce systems.
ASSURING PERFORMANCE IN E-COMMERCE SYSTEMS Dr. John Murphy Abstract Performance Assurance is a methodology that, when applied during the design and development cycle, will greatly increase the chances
An Oracle White Paper March 2013. Load Testing Best Practices for Oracle E- Business Suite using Oracle Application Testing Suite
An Oracle White Paper March 2013 Load Testing Best Practices for Oracle E- Business Suite using Oracle Application Testing Suite Executive Overview... 1 Introduction... 1 Oracle Load Testing Setup... 2
An introduction to load testing for Web applications. Business white paper
An introduction to load testing for Web applications Business white paper Table of contents Introduction...3 Grow your business through online exposure...3 Application performance testing prior to going
Analyzing Big Data with Splunk A Cost Effective Storage Architecture and Solution
Analyzing Big Data with Splunk A Cost Effective Storage Architecture and Solution Jonathan Halstuch, COO, RackTop Systems [email protected] Big Data Invasion We hear so much on Big Data and
Load Testing Scenarios Selection
Load Testing Scenarios Selection Abstract The purpose of Load testing is to identify and resolve all the application performance bottlenecks before they affect the application real users. If a web application
Dell Compellent Storage Center SAN & VMware View 1,000 Desktop Reference Architecture. Dell Compellent Product Specialist Team
Dell Compellent Storage Center SAN & VMware View 1,000 Desktop Reference Architecture Dell Compellent Product Specialist Team THIS WHITE PAPER IS FOR INFORMATIONAL PURPOSES ONLY, AND MAY CONTAIN TYPOGRAPHICAL
