Functionality & Performance Testing Analysis Services 2005 with Teradata v12
Rupal Shah, Teradata Corporation
James Basilico, Microsoft Corporation

Rev. 1.0, July 2008
Copyright

The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the date of publication. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information presented after the date of publication.

This White Paper is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.

Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

2008 Microsoft Corporation. All rights reserved.

Microsoft, Windows, and Windows Server are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries. All other trademarks are property of their respective owners.
Table of Contents

Scope
Introduction
Lab Environment Design and Configuration
Functionality Testing of Teradata v12.0 with Analysis Services 2005
Performance Results Against Teradata Database v2r
    100 User Load Test Results
    200 User Load Test Results
    500 User Load Test Results
    1000 User Load Test Results
    Comparison of User Load Test Results using Teradata Database v2r
Performance Results Against Teradata Database v12.0
    100 User Load Test Results
    200 User Load Test Results
    500 User Load Test Results
    1000 User Load Test Results
    Comparison of User Load Test Results using Teradata Database v12.0
Conclusions
Follow-On Performance Testing Efforts
Appendix A: Applicable Links
Appendix B: Analysis Services Stress Testing Toolset
    Query Generation through ASQueryGen
    Client Simulation through ASLoadSim
    Stress Test Reports and Counters Used
For more information
Scope

This paper describes the results from functionality and basic performance testing of SQL Server 2005 Analysis Services leveraging Teradata v12. This paper is targeted at Business Intelligence (BI) administrators, system integrators and database administrators. Readers are expected to have a basic understanding of the features of the Teradata Database Aggregate Join Index and Microsoft SQL Server 2005 Analysis Services products.
Introduction

The goal of this paper is to document lab environment efforts identifying the key architectural design strategies of a combined Teradata Relational Database and Analysis Services solution in a ROLAP environment. The objective of the lab environment tests is to benchmark and understand the .NET Data Provider for Teradata and Analysis Services 2005 components with respect to functionality, throughput requests and response time.

The lab environment exercises focused on preliminary benchmark tests using a 100GB database, and then tuned the system to get optimal results against both a Teradata Database v2r and a Teradata v12.0 backend, using the .NET Data Provider for Teradata v12.0 in both cases. Also part of this testing was a comparison of these results to the previous benchmark results conducted by the Microsoft Technology Center (MTC) in July 2007, which used Teradata Database v2r and the .NET Data Provider for Teradata v1.2 with Analysis Services 2005.

As mentioned already, this effort leveraged the Teradata Database v2r environment that was constructed for the original MTC testing completed in July 2007. The original testing focused on optimizing the Project REAL warehouse with a goal of identifying the optimal hardware configuration and best practices for a ROLAP environment. The scope of that effort was as follows:

Database preparation:
- Migration of the Project REAL sample data kit to a Teradata system provided by Teradata.
- Identify an appropriate database schema for Project REAL analytics based on the POC objectives.
- Redesign the Project REAL solution into a Snow-Flake schema to more effectively take advantage of Teradata optimization capabilities.
- Grow the data to 100GB.
- Tune the Teradata system for the user load.

For our effort, we focused on the following:
- Functionality testing of Analysis Services running against Teradata Database v12.0 and the .NET Data Provider for Teradata v12.0.
- Test script preparation leveraging Visual Studio Test Suite for Testers.
- Benchmark and optimize the Project REAL solution for user load against Teradata Database v2r using the new .NET Data Provider for Teradata v12.0.
- Database migration from Teradata Database v2r to Teradata Database v12.0.
- Benchmark and optimize the Project REAL solution for user load against Teradata Database v12.0 using the new .NET Data Provider for Teradata v12.0.
The successful execution of the engagement resulted in the following benchmark results (based on SQL Server 2005 Analysis Services SP2):

- Ran various user loads up to 1000 users with response times less than 1.2 seconds for tests against the Teradata Database v2r engine using the .NET Data Provider for Teradata v12.0.
- Ran various user loads up to 1000 users with response times less than 0.93 seconds for tests against the Teradata Database v12.0 engine using the .NET Data Provider for Teradata v12.0.

The hardware configuration is as follows:

Analysis Services System:
- DELL two-socket, quad-core system (eight total processor cores) with 1.86 GHz Intel Xeon and 4GB RAM
- 1 TB drive. The Analysis Services system does not require disk storage space for a ROLAP storage deployment. In a Teradata scenario, a best practice is to store the dimensions as MOLAP; as a result, an insignificant amount of disk space was used.
- Windows 2003 Server x64
- SQL Server 2005 x64 SP2 (patch level 3175)

Teradata System:
- 4-node Teradata Database 5500 system running v2r, which was then also upgraded to v12.0.
- Disks: RAID1 with 96 x 73GB disks
- Each node has: 12 AMPs and 2 PE's; Ghz; 8GB RAM; Linux/SUSE9 (64-bit)
Lab Environment Design and Configuration

For the tests described within this paper, the lab environment leveraged the work completed in July 2007 by the Chicago MTC and Teradata personnel. Instead of needing to set up the Teradata environment or collect representative queries to benchmark against, the lab exercises could focus on functionality testing of v12.0 and on benchmarking the new .NET Data Provider for Teradata v12.0 (against v2r and v12.0).

The following is a summary of the activities performed during the course of the original efforts in July 2007. The exact order and level of parallel effort may not be directly expressed in this summary.

Database Preparation:
- Migrated sample data from the Project REAL kit into a Teradata system provided by Teradata. This was accomplished at the Microsoft Technology Center in Chicago through remote access to a Teradata system located at the Rancho Bernardo Center in San Diego. Used Teradata administration tools to migrate the schema and SQL Server Integration Services to load the data.
- Identified a set of Reporting Services reports for this activity and, with these reports, identified the tables and columns being queried.
- Migrated the Project REAL Star schema into a Snow-Flake schema and tuned it for the Teradata system. Star schemas would perform well, but results would depend on the volume of data, cardinality, and size of the Teradata Database system, as well as customer objectives and Aggregate Join Index (AJI) 1 design for analytics. For our activities, we performed the database tuning and considerations mentioned below:
  - Created additional tables to support snowflake dimensions.
  - Ensured all dimension table primary keys are defined as unique, utilizing the UNIQUE constraint, or the primary key is defined as a UNIQUE PRIMARY INDEX.
  - Ensured all UNIQUE PRIMARY INDEX (UPI) columns are defined with NOT NULL.
  - Ensured (where possible) all primary and foreign keys are on the ID and not Name or Description columns. This will result in a smaller AJI, which means faster data access.
  - Ensured single-level dimensions have a supporting reference/lookup/dimension table for optimal performance.
  - Populated the snow-flake dimension tables with data (i.e., insert/select).

1 An AJI is an aggregated result set saved as an index in the database. It is transparent to end-users and BI administrators and will be used automatically by the Teradata optimizer when a query plan contains frequently used like columns and aggregates. Refer to the whitepaper in Appendix A.
  - Ensured the Fact table design is wide 2 (i.e., columns for each dimension and measure) and modified the Primary Index (PI) to be a composite key of all non-measure columns.
  - Collected statistics on all primary key/foreign key relationship columns.
  - Implemented Referential Integrity (RI) on the primary key/foreign key columns. RI can be defined with the check (a.k.a. hard RI) or no check option (a.k.a. soft RI).

A Snow-Flake schema with the tuning mentioned above will result in a smaller AJI, which can produce faster query response, versus a Star schema, which may require less database tuning activity but will require all columns of interest to be included in the AJI definition. Customers should do their own due diligence to determine the best schema design and database tuning based on their objectives.

- Redesigned the Project REAL Analysis Services solution on the Snow-Flake schema while maintaining a consistent matching database structure to facilitate the query requirements of the Reporting Services reports.
- Increased the size of the fact table to 100GB. The Project REAL sample kit contained a sample set of rows spanning 1 year, and it was replicated for additional years to grow the fact data.

The following dimension map identifies the Analysis Services and Reporting Services reporting requirements for our AJI strategy for performance:

TIME: Year_Desc, Qtr_Desc, Month_Desc, Week_Desc, Date_Value
STORE: Division, Region, District, City, Store_Desc
ITEM: Product, Purchase Vendor, Subject, Category, Sub. Category, Item_Desc
DEPARTMENT: Department Title
BUYER: Buyer Alpha, Buyer Name
REPLEN STRATEGY: Strategy Index, Strategy Type
MEASURE: Sales_Qty, Sale_Amt
- Built 1 broad 3 AJI to support/cover the above reporting requirement:

CREATE JOIN INDEX ms.aji_store_sales, NO FALLBACK, CHECKSUM = DEFAULT AS
SELECT
    COUNT(*)(FLOAT, NAMED CountStar),
    ms.f.buyer_alpha,
    ms.a.sk_dept_id,
    ms.e.strategy_ind,
    ms.b.purch_vendor_num,
    ms.b.dept,
    ms.b.subcategory_code,
    ms.c.city,
    ms.d.calendar_year_desc,
    ms.d.calendar_qtr_desc,
    ms.d.calendar_month_desc,
    ms.d.calendar_week_desc,
    SUM(ms.a.Sales_Qty)(FLOAT, NAMED Sales_Qty),
    SUM(ms.a.Sale_Amt)(FLOAT, NAMED Sale_Amt)
FROM
    ms.vtbl_fact_store_sales a,
    ms.vtbl_dim_item b,
    ms.vtbl_dim_store c,
    ms.vtbl_dim_date d,
    ms.vtbl_dim_model_strategy e,
    ms.vtbl_dim_buyer f
WHERE ms.a.sk_item_id = ms.b.sk_item_id
  AND ms.a.sk_store_id = ms.c.sk_store_id
  AND ms.a.sk_date_id = ms.d.sk_date_id
  AND ms.a.sk_model_strategy_id = ms.e.sk_model_strategy_id
  AND ms.a.sk_buyer_id = ms.f.sk_buyer_id
GROUP BY
    ms.f.buyer_alpha, ms.a.sk_dept_id, ms.e.strategy_ind,
    ms.b.purch_vendor_num, ms.b.dept, ms.b.subcategory_code, ms.c.city,
    ms.d.calendar_year_desc, ms.d.calendar_qtr_desc,
    ms.d.calendar_month_desc, ms.d.calendar_week_desc
PRIMARY INDEX (Buyer_Alpha, SK_Dept_ID, Strategy_Ind, Purch_Vendor_Num,
    Dept, Subcategory_Code, City, Calendar_Year_Desc, Calendar_Qtr_Desc,
    Calendar_Month_Desc, Calendar_Week_Desc);

2 A wide Fact table design has the measure and dimension columns in it, with the dimension columns being used in the composite Primary Key of the fact table. Refer to the whitepaper in Appendix A.

3 The broad AJI constitutes an index which covers all dimensions and levels of interest based on the aforementioned physical database stipulations, to effectively address a multi-level dimensional hierarchy/model. Refer to the whitepaper mentioned earlier in this paper for more details.

Test Scripts preparation:
- Captured the MDX from the Project REAL reporting solution.
- Used a random query generation tool (ASLoadSim) to create a large set of queries; each query contained random parameter values.

Additional lab design and configuration for our latest tests included the following.

Database Preparation:
- Once the benchmarking against the Teradata Database v2r environment using the .NET Data Provider for Teradata v12.0 was completed, the database environment was upgraded to Teradata Database v12.0.

Benchmarking:
- Created a Visual Studio .NET 2005 Team Suite test project (using ASLoadSim) to simulate various Analysis Services OLAP user loads in a stepped fashion, with each load running for 3 minutes once the maximum user load was reached.
- Ran increasing user loads up to 1000 users, first against the Teradata Database v2r environment and then against the v12.0 environment.
- Optimized the benchmark testing methods to apply as much stress as possible to the .NET Data Provider for Teradata.
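The stepped ramp-up used in these benchmark runs can be sketched as follows. This is a minimal model, not the actual ASLoadSim/Visual Studio configuration; the 20-users-per-step figure is an inference from the reported step times (50 seconds for 100 users up to 500 seconds for 1000 users, at 10-second step intervals) and is not stated explicitly in the paper.

```python
import math

def stepped_load(max_users, users_per_step=20, step_interval_s=10,
                 constant_time_s=180):
    """Model a stepped user ramp: add users_per_step simulated users
    every step_interval_s seconds, then hold the full load for
    constant_time_s (the 3-minute constant period)."""
    steps = math.ceil(max_users / users_per_step)
    step_time_s = steps * step_interval_s
    return {
        "steps": steps,
        "step_time_s": step_time_s,
        "total_duration_s": step_time_s + constant_time_s,
    }

for load in (100, 200, 500, 1000):
    print(load, stepped_load(load))
```

Under these assumptions, stepped_load(500) gives a 250-second ramp plus the 3-minute hold, matching the reported test duration for the 500-user runs.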
Functionality Testing of Teradata v12.0 with Analysis Services 2005

The functionality tests successfully completed a series of Analysis Services development functions using a remote Analysis Services 2005 SP2 (build 3175) server connected to Teradata v12.0. These tests were conducted as initial functionality testing prior to the actual benchmarking runs. The functionality tests included the following areas:

Functional Test 1 - Ability to create an Analysis Services Data Source View (DSV) against Teradata Database v12.0 using the .NET Data Provider for Teradata v12.0.

Functional Test 2 - Ability to process Analysis Services dimensions and cubes which were sourced from Teradata Database v12.0 using the .NET Data Provider for Teradata v12.0.
Functional Test 3 - Ability to query Analysis Services ROLAP cubes that are created against Teradata Database v12.0 using the .NET Data Provider for Teradata v12.0.

Functional Test 4 - Ability to execute reports from Reporting Services 2005 which have their data source defined against Analysis Services cubes that have Teradata Database v12.0 data, using the .NET Data Provider for Teradata v12.0.
Functional Test 5 - Ability to execute reports from Reporting Services Report Builder 2005, whereby the Report Model was generated from an Analysis Services 2005 ROLAP cube that has its data source connected to a Teradata Database v12.0 using the .NET Data Provider for Teradata v12.0.
Performance Results Against Teradata Database v2r

The following results are from the user stress tests against Teradata Database v2r using the .NET Data Provider for Teradata v12.0.

100 User Load Test Results

Figure 1. 100 User Load Test Results from within Visual Studio .NET 2005

User Load: 100
Warm-Up Period and Step Intervals: 10 seconds for both
Think Time: 2 seconds
Test Duration (step time plus constant time): 50 seconds + 3 minutes
Avg % Processor Time: 91
Avg Response Time: 0.1 seconds
Queries Executing on TD: 655
At 100 concurrent users with a Think Time of 2 seconds, a total of 812 tests were executed, out of which 655 queries executed on the Teradata server. The Analysis Services machine was running at 91 Avg % Processor Time, and the Avg Query Response Time was well under a second (0.10 sec).

200 User Load Test Results

Figure 2. 200 User Load Test Results from within Visual Studio .NET 2005

User Load: 200
Warm-Up Period and Step Intervals: 10 seconds for both
Think Time: 2 seconds
Test Duration (step time plus constant time): 100 seconds + 3 minutes
Avg % Processor Time: 88
Avg Response Time: 0.59 seconds
Queries Executing on TD: 1076

At 200 users, the number of total tests executed was 958, with 1076 tests resulting in queries executing at the Teradata server. As more tests were executed, the Analysis Services cache was built up, and most of the requests were answered from the cache. The Avg % CPU Time was 88, with an Avg Response Time of 0.59 seconds.

500 User Load Test Results

Figure 3. 500 User Load Test Results from within Visual Studio .NET 2005

User Load: 500
Warm-Up Period and Step Intervals: 10 seconds for both
Think Time: 2 seconds
Test Duration (step time plus constant time): 250 seconds + 3 minutes
Avg % Processor Time: 88
Avg Response Time: 0.72 seconds
Queries Executing on TD: 1722

At 500 users, the number of total tests executed was 913, with 1722 queries executing at the Teradata server. The CPU reached its threshold during various points of the test and was at 88%. The Avg Response Time of queries increased considerably, from around 0.10 seconds for the 100-user test to 0.72 seconds.
1000 User Load Test Results

Figure 4. 1000 User Load Test Results from within Visual Studio .NET 2005

User Load: 1000
Warm-Up Period and Step Intervals: 10 seconds for both
Think Time: 2 seconds
Test Duration (step time plus constant time): 500 seconds + 3 minutes
Avg % Processor Time: 92
Avg Response Time: 1.2 seconds
Queries Executing on TD: 4088

At 1000 users, the number of total tests executed was 961, with 4088 queries executing at the Teradata server. The CPU reached its threshold during various points of the test and was at 92%. The Avg Response Time of queries increased considerably, from around 0.10 seconds for the 100-user test to 1.2 seconds. With the increased workload of 1000 users, the CPU reached threshold violations at 90%.
Comparison of User Load Test Results using Teradata Database v2r

Figure 5. Comparison of Teradata Database v2r User Load Test Results
Performance Results Against Teradata Database v12.0

The following results are from the user stress tests against Teradata Database v12.0 using the .NET Data Provider for Teradata v12.0.

100 User Load Test Results

Figure 6. 100 User Load Test Results from within Visual Studio .NET 2005

User Load: 100
Warm-Up Period and Step Intervals: 10 seconds for both
Think Time: 2 seconds
Test Duration (step time plus constant time): 70 seconds + 3 minutes
Avg % Processor Time: 92
Avg Response Time: 0.10 seconds
Queries Executing on TD: 649

At 100 concurrent users with a Think Time of 2 seconds, a total of 676 tests were executed, out of which 649 queries executed on the Teradata server. The Analysis Services machine was running at 92 Avg % Processor Time, and the Avg Query Response Time was well under a second (0.10 sec).

200 User Load Test Results

Figure 7. 200 User Load Test Results from within Visual Studio .NET 2005

User Load: 200
Warm-Up Period and Step Intervals: 10 seconds for both
Think Time: 2 seconds
Test Duration (step time plus constant time): 100 seconds + 3 minutes
Avg % Processor Time:
Avg Response Time: 0.26 seconds
Queries Executing on TD: 1091

At 200 users, the number of total tests executed was 719, with 1091 queries executing at the Teradata server. As more tests were executed, the Analysis Services cache was built up, and most of the requests were answered from the cache, with an Avg Response Time of 0.26 seconds.
500 User Load Test Results

Figure 8. 500 User Load Test Results from within Visual Studio .NET 2005

User Load: 500
Warm-Up Period and Step Intervals: 10 seconds for both
Think Time: 2 seconds
Test Duration (step time plus constant time): 250 seconds + 3 minutes
Avg % Processor Time: 90.33
Avg Response Time: 0.67 seconds
Queries Executing on TD: 1825

At 500 users, the number of total tests executed was 700, with 1825 queries executing at the Teradata server. The CPU reached its threshold during various points of the test and was at 90.33%. The Avg Response Time of queries increased considerably, from around 0.26 seconds to 0.67 seconds.
1000 User Load Test Results

Figure 9. 1000 User Load Test Results from within Visual Studio .NET 2005

User Load: 1000
Warm-Up Period and Step Intervals: 10 seconds for both
Think Time: 2 seconds
Test Duration (step time plus constant time): 500 seconds + 3 minutes
Avg % Processor Time: 90
Avg Response Time:
Queries Executing on TD:

With the increased workload of 1000 users, the CPU reached many threshold violations at 90%, and three tests failed.
Comparison of User Load Test Results using Teradata Database v12.0

Figure 10. Comparison of Teradata Database v12.0 User Load Test Results
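For side-by-side reading, the average response times reported in the two sets of runs above can be collected programmatically. The numbers below are copied from this paper's tables and prose; the 1000-user v12.0 average is not given in its per-test table (only the summary bound of under 0.93 seconds), so it is left as None here.

```python
# Avg response times (seconds) by user load, as reported in this paper.
avg_response = {
    "v2r":   {100: 0.10, 200: 0.59, 500: 0.72, 1000: 1.2},
    "v12.0": {100: 0.10, 200: 0.26, 500: 0.67, 1000: None},
}

def faster_engine(load):
    """Return which engine reported the lower average response time."""
    a, b = avg_response["v2r"][load], avg_response["v12.0"][load]
    if a is None or b is None:
        return None  # not comparable from the published figures
    if a == b:
        return "tie"
    return "v12.0" if b < a else "v2r"

for load in (100, 200, 500, 1000):
    print(load, faster_engine(load))
```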
Conclusions

The following conclusions are drawn from this functional/performance test:

1. The Analysis Services ROLAP cube against Teradata Database v12.0 outperformed Teradata Database v2r in all tests with the .NET Data Provider for Teradata v12.0.
2. Except for the outlier v2r result for 200 users, the Teradata Database v12.0 RDBMS platform outperformed all other results. (Note also that the original v2r test may need to be rerun due to unknown simulation settings.)
3. As the number of users reaches 500, the average response time increases from around 0.10 seconds for all test configurations (at 100 users) to 0.90 seconds in the worst case for 500 users. Therefore, a sweet spot for the concurrent-user threshold (based upon average response time) would be around 500 users for consistent sub-second response time using the same or similar hardware.
4. It is also noteworthy that during the tests (against all environments), Analysis Services began answering queries via its cache at some point and thus did not query the Teradata database after that point. This is usually a good occurrence for query performance, but with limited memory and fully utilized CPU resources during our tests, additional user load and/or improved average response times might otherwise have been observed.
5. For practical reasons, a more precise benchmarking test environment would have separated the SQL Server RDBMS used to record test results from the Analysis Services service. In addition, the actual testing tool also ran on this same server; in many instances it was noticed that, between the SQL Server relational database and the testing tool, CPU resources would exceed 75%, leaving only 25% for Analysis Services.
Figure 11. Comparison of Teradata Database v2r to v12.0 User Load Test Results

Follow-On Performance Testing Efforts

As this paper describes the results from functionality and basic performance testing of SQL Server 2005 Analysis Services leveraging Teradata sources, there are additional high-performance stress tests that would provide greater insight into the throughput potential of the .NET Data Provider for Teradata and Analysis Services. A future lab environment effort would use the key learnings and optimizations documented above to execute high-performance stress tests. No date and/or team has been set yet for this potential phase 2 round of tests, but it is expected within six months of this paper's publication date.
Appendix A: Applicable Links

The following links are provided as additional references to materials which are relevant to the discussions.

Analysis Services section of the SQL Server Developer Center on MSDN

Whitepaper/Code - SQL Server Analysis Services 2005 Load Testing Best Practices
This paper describes how you can leverage Visual Studio .NET to create stress tests against Microsoft SQL Server Analysis Services using the CodePlex open source ASLoadSim tool. The link below will direct you to the ASLoadSim tools and accompanying white paper on its use.

Whitepaper - Improve your OLAP Environment with Microsoft and Teradata
This paper describes how you can leverage Microsoft SQL Server Analysis Services and Teradata technologies to improve your analytical OLAP application environment, specifically a relational (ROLAP) type solution.
A059-7A7A732058A2&displaylang=en

SQL Server 2005 Books Online
Download an updated version of Books Online for Microsoft SQL Server 2005, the primary documentation for SQL Server 2005.
Appendix B: Analysis Services Stress Testing Toolset

Microsoft Visual Studio 2005 Team Edition for Software Testers is an easy-to-use yet comprehensive testing technology for web and performance testers. Through the built-in performance testing tools of the software, the following benefits are delivered:

- Developers and testers can simulate production loads and diagnose performance issues quickly in their testing labs and pre-production environments.
- Prescriptive guidance increases tester productivity through pre-packaged counters, pre-set thresholds, and built-in knowledge of server behavior.
- Leverage pre-built views of load statistics and their effect on the servers under test.
- Enable your entire team to identify bottlenecks early in the software development lifecycle.
- Flexible load-agent licensing helps more team members get access to testing tools.

Using this architecture from the VS.NET 2005 Team Edition Software Tester offering, a new performance test template was created by the Microsoft SQL Server product team to track query performance of Analysis Services through a simulated concurrent user workload. The add-in template is actually comprised of two parts, one of which lives outside the VS.NET 2005 Team Test environment. The two parts of this simulation tool are: the Analysis Services Query Generator (ASQueryGen for short; the portion of the tool that lives outside of Visual Studio), and the Analysis Services Load Simulator (ASLoadSim for short). The following two sections describe these two VS.NET 2005 Team Edition tester template tools, how they are configured, and how they were used for this Teradata-SSAS end-user query stress test.

Query Generation through ASQueryGen

The ASQueryGen tool is a command-line program that accepts an XML configuration file (partially shown below) and, when executed, generates from one to however many client query files are requested.
Each query file will have the same number of MDX queries generated, but if/when the token values are used with the query tag statements, the ASQueryGen tool will replace the token with randomized dimension values as specified by the token, thus creating slightly different MDX queries to run through SSAS.

<Queries>

<Query>/* 1 : 1 */ WITH MEMBER [Measures].[cPct of Sbj Qty per Vndr] AS '([Measures].[Sale Amt])/([Measures].[Sale Amt], [Item].[Purchase Vendor].[All])' MEMBER [Measures].[cPct of Subj Qty per Curr Vendor] AS '([Measures].[Sale Amt])/([Measures].[Sale Amt], [Item].[Purchase Vendor].currentmember, [Item].[By Dept].[All])' SELECT NON EMPTY { [Measures].[Sale Amt], [Measures].[cPct of Sbj Qty per Vndr], [Measures].[cPct of Subj Qty per Curr Vendor] } ON COLUMNS, NON EMPTY { ([Item].[By Purchase Vendor].[Purchase Vendor].ALLMEMBERS * [Item].[By Dept].[Dept].ALLMEMBERS ) } DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS FROM ( SELECT ( ([Time].[Fiscal].[Fiscal Period].&[5]&[2008]) ) ON COLUMNS FROM ( SELECT ( { [Item].[By Dept].[Dept].&[JUVENILE], [Item].[By Dept].[Dept].&[HARDCOVER], [Item].[By Dept].[Dept].&[MASS MARKET], [Item].[By Dept].[Dept].&[TRADE PAPERBACK] } ) ON COLUMNS FROM [REAL Warehouse])) WHERE ( IIF( ([Time].[Fiscal].[Fiscal Period].&[5]&[2008]).Count = 1, ([Time].[Fiscal].[Fiscal Period].&[5]&[2008]), [Time].[Fiscal].currentmember ) ) CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS</Query>

<Query>/* 1 : 2 */ SELECT NON EMPTY { [Measures].[cAvg Retail] } ON COLUMNS, NON EMPTY { ([Time].[Calendar].[Calendar Month].ALLMEMBERS ) } DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS FROM ( SELECT ( ([Store].[District].&[2.8E1]) ) ON COLUMNS FROM ( SELECT ( ([Time].[Calendar].[Calendar Year].&[2008]) ) ON COLUMNS FROM [REAL Warehouse])) WHERE ( IIF( ([Store].[District].&[2.8E1]).Count = 1, ([Store].[District].&[2.8E1]), [Store].[District].currentmember ) ) CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS</Query>

<Query>/* 1 : 3 */ SELECT NON EMPTY { [Measures].[Sales Qty], [Measures].[cAvg Retail] } ON COLUMNS, NON EMPTY { ([Item].[Item].[Item].ALLMEMBERS ) } DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS FROM ( SELECT ( ([Time].[Calendar].[Calendar Month].&[6]&[2008]) ) ON COLUMNS FROM ( SELECT ( ([Item].[Category].&[174D]) ) ON COLUMNS FROM ( SELECT ( ([Store].[District].&[2.8E1]) ) ON COLUMNS FROM [REAL Warehouse]))) WHERE ( IIF( ([Store].[District].&[2.8E1]).Count = 1, ([Store].[District].&[2.8E1]), [Store].[District].currentmember ), IIF( ([Item].[Category].&[174D]).Count = 1, ([Item].[Category].&[174D]), [Item].[Category].currentmember ), IIF( ([Time].[Calendar].[Calendar Month].&[6]&[2008]).Count = 1, ([Time].[Calendar].[Calendar Month].&[6]&[2008]), [Time].[Calendar].currentmember ) ) CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS</Query>

<Query>/* 1 : 4 */ WITH MEMBER [Measures].[cPct of Sbj Qty per Vndr] AS
'([Measures].[Sale Amt])/([Measures].[Sale Amt], [Item].[Purchase Vendor].[All])' SELECT NON EMPTY { [Measures].[Sale Amt], [Measures].[cPct of Sbj Qty per Vndr] } ON COLUMNS, NON EMPTY { ([Item].[By Purchase Vendor].[Purchase Vendor].ALLMEMBERS * [Item].[By Dept].[Dept].ALLMEMBERS ) } DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS FROM ( SELECT ( ([Time].[Fiscal].[Fiscal Period].&[5]&[2008]) ) ON COLUMNS FROM ( SELECT ( { [Item].[By Dept].[Dept].&[JUVENILE], [Item].[By Dept].[Dept].&[HARDCOVER], [Item].[By Dept].[Dept].&[MASS MARKET], [Item].[By Dept].[Dept].&[TRADE PAPERBACK] } ) ON COLUMNS FROM [REAL Warehouse])) WHERE ( IIF( ([Time].[Fiscal].[Fiscal Period].&[5]&[2008]).Count = 1, ([Time].[Fiscal].[Fiscal Period].&[5]&[2008]), [Time].[Fiscal].currentmember ) ) CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS</Query> <Query>/* 1 : 5 */ WITH MEMBER [Measures].[Pct of Qty Vendor] AS '[Measures].[Sales Qty]/([Measures].[Sales Qty], [Item].[Purchase Vendor].[All])' SELECT NON EMPTY { [Measures].[Sale Amt], [Measures].[Pct of Qty Vendor], [Measures].[Sales Qty] } ON COLUMNS, NON EMPTY { ([Item].[By Purchase Vendor].[Purchase Vendor].ALLMEMBERS ) } DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS FROM ( SELECT ( ([Time].[Fiscal].[Fiscal Week].&[20]&[2008]) ) ON COLUMNS FROM ( SELECT ( ([Store].[District].&[2.8E1]) ) ON COLUMNS FROM [REAL Warehouse])) WHERE ( IIF( ([Store].[District].&[2.8E1]).Count = 1, ([Store].[District].&[2.8E1]), [Store].[District].currentmember ), IIF( ([Time].[Fiscal].[Fiscal Week].&[20]&[2008]).Count = 1, ([Time].[Fiscal].[Fiscal Week].&[20]&[2008]), [Time].[Fiscal].currentmember ) ) CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS</Query> <Query>/* 1 : 6 */ WITH MEMBER [Measures].[Pct of Qty Vendor] AS '[Measures].[Sales Qty]/([Measures].[Sales Qty], [Item].[All])' SELECT NON EMPTY { 
[Measures].[Sale Amt], [Measures].[Pct of Qty Vendor], [Measures].[Sales Qty] } ON COLUMNS, NON EMPTY { ([Item].[By Category].[Subject].ALLMEMBERS * [Item].[By Purchase Vendor].[Purchase Vendor].ALLMEMBERS ) } DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS FROM ( SELECT ( ([Time].[Fiscal].[Fiscal Week].&[20]&[2008]) ) ON COLUMNS FROM ( SELECT ( ([Store].[District].&[2.8E1]) ) ON COLUMNS FROM [REAL Warehouse])) WHERE ( IIF( ([Store].[District].&[2.8E1]).Count = 1, ([Store].[District].&[2.8E1]), [Store].[Geography].currentmember ), IIF( ([Time].[Fiscal].[Fiscal Week].&[20]&[2008]).Count = 1, ([Time].[Fiscal].[Fiscal Week].&[20]&[2008]), [Time].[Fiscal].currentmember ) ) CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS</Query> <Query>/* 1 : 7 */ SELECT NON EMPTY { [Measures].[Sale Amt] } ON COLUMNS, NON EMPTY { ([Item].[By Category].[Subject].ALLMEMBERS ) } DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS FROM ( SELECT ( ([Item].[Dept].&[HARDCOVER]) ) ON COLUMNS FROM ( SELECT ( ([Store].[District].&[2.8E1]) ) ON COLUMNS FROM ( SELECT ( ([Time].[Fiscal].[Fiscal Qtr].&[2]&[2008]) ) ON COLUMNS FROM [REAL Warehouse]))) WHERE ( IIF( ([Time].[Fiscal].[Fiscal Qtr].&[2]&[2008]).Count = 1, ([Time].[Fiscal].[Fiscal Qtr].&[2]&[2008]), [Time].[Fiscal].currentmember ), IIF( ([Store].[District].&[2.8E1]).Count = 1, ([Store].[District].&[2.8E1]), [Store].[District].currentmember ), IIF( ([Item].[Dept].&[HARDCOVER]).Count = 1, ([Item].[Dept].&[HARDCOVER]), [Item].[Dept].currentmember ) ) CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS</Query> <Query>/* 1 : 8 */ SELECT NON EMPTY { [Measures].[Sales Qty] } ON COLUMNS, NON EMPTY { ([Item].[Subject].[Subject].ALLMEMBERS ) } DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS FROM ( SELECT ( ([Time].[Fiscal].[Fiscal Period].&[5]&[2008]) ) ON COLUMNS FROM ( SELECT ( ([Store].[District].&[2.8E1]) ) ON COLUMNS FROM [REAL Warehouse])) WHERE ( IIF( ([Store].[District].&[2.8E1]).Count = 1, ([Store].[District].&[2.8E1]), [Store].[District].currentmember ), IIF( ([Time].[Fiscal].[Fiscal Period].&[5]&[2008]).Count = 1, ([Time].[Fiscal].[Fiscal Period].&[5]&[2008]), [Time].[Fiscal].currentmember ) ) CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS</Query> </Queries>

When creating the XML configuration file, we set the number of client files to be generated to 1,000, which corresponds to the maximum number of clients used in a single SSAS database baseline cube query stress run.

Client Simulation through ASLoadSim

Once the client files are created, a Load Simulation project can be established within Visual Studio 2005 Team Edition. Because the ASLoadSim tool leverages the Team Edition for Software Testers environment, most of the steps to create the load simulation test are driven directly from the new test wizard. For example, the wizard creates the test scenario, the load sets (that is, the number of simultaneous clients), the duration of the test, and the warm-up period. Depicted below is the load test project after the wizard steps were completed. Figure 12.
VS.NET 2005 Team Edition ASLoadSim Project
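Each generated client file is a small XML document whose <Query> elements hold MDX statements like those shown above. As a hedged illustration (this is not ASLoadSim or ASQueryGen source code, and the sample queries below are simplified placeholders), extracting those statements for replay could look like this:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch only -- not ASLoadSim code. It shows how the MDX
# statements inside a client file's <Query> elements could be pulled out
# for replay. Real client files would escape any "&" in member keys as
# "&amp;" to remain well-formed XML; the sample below avoids them.
SAMPLE_CLIENT_FILE = """
<Queries>
  <Query>/* 1 : 1 */ SELECT NON EMPTY { [Measures].[Sale Amt] } ON COLUMNS FROM [REAL Warehouse]</Query>
  <Query>/* 1 : 2 */ SELECT NON EMPTY { [Measures].[cAvg Retail] } ON COLUMNS FROM [REAL Warehouse]</Query>
</Queries>
"""

def load_queries(xml_text):
    """Return the text of every <Query> element in a client file."""
    root = ET.fromstring(xml_text)
    return [q.text for q in root.iter("Query")]

queries = load_queries(SAMPLE_CLIENT_FILE)
print(len(queries))  # prints 2
```

A replay harness would hand each extracted statement to an ADOMD.NET connection; here the parsing step alone is shown.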
One manual alteration must be made to the ASLoadSim project at this point: adding the location from which the project picks up its XML configuration file. This XML configuration file for the ASLoadSim project contains important information such as which SSAS database to connect to, which SQL Server RDBMS to store the test results in, which client files to execute, how many of them, what think-time minimums/maximums should be used, and whether or not to randomize the MDX queries that are executed against the SSAS database. Below is a representation of the ASLoadSim XML configuration file that was used.

<Configuration>
  <Server>localhost</Server>
  <Database>REAL Warehouse Sample TD</Database>
  <ClientFileName>C:\REAL_TD_ROLAP_TESTS\QueryGenData\Queries\Query.xml</ClientFileName>
  <ClientFileStartingNumber>1</ClientFileStartingNumber>
  <NumOfClientFiles>200</NumOfClientFiles>
  <ThinkTimeMin>2</ThinkTimeMin>
  <ThinkTimeMax>2</ThinkTimeMax>
  <DataAccessMethod>Random</DataAccessMethod>
  <LogToServer>localhost</LogToServer>
  <LogToDB>LoadTest</LogToDB>
</Configuration>

Figure 13. XML Configuration File for ASLoadSim Project

Stress Test Reports and Counters Used

While the stress tests were running, several key counters were recorded via Windows System Monitor (Perfmon), including processor utilization, total memory consumption, memory usage by Analysis Services, Analysis Services connections, and so on. A key feature of the ASLoadSim toolset is that all of the characteristics of each stress load, and, more importantly, the results of those stress load tests, are recorded in a SQL Server 2005 relational database for future analysis and reporting. Included with ASLoadSim are several SQL Server Reporting Services reports that help identify load test runs and then surface key result statistics for specific runs, such as average response time, number of queries executed, and errors (fatal queries/connections).
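The think-time and data-access settings in the configuration file govern how each simulated client paces itself and picks its queries. A minimal sketch of that behavior (illustrative only, not the ASLoadSim implementation; the function name and parameters are assumptions for this example):

```python
import random
import time

# Illustrative sketch only -- not ASLoadSim code. It mimics the pacing
# implied by <ThinkTimeMin>/<ThinkTimeMax> and the query selection implied
# by <DataAccessMethod>Random</DataAccessMethod>.
def simulate_client(queries, think_min=2, think_max=2,
                    data_access="Random", iterations=10,
                    execute=None, sleep=time.sleep):
    """Replay `iterations` queries, pausing a think time between each."""
    executed = []
    for i in range(iterations):
        if data_access == "Random":
            query = random.choice(queries)       # randomized MDX selection
        else:
            query = queries[i % len(queries)]    # sequential replay
        if execute is not None:
            execute(query)                       # e.g., send the MDX to SSAS
        executed.append(query)
        sleep(random.uniform(think_min, think_max))  # think time in seconds
    return executed

# With ThinkTimeMin and ThinkTimeMax both set to 2, as in the configuration
# above, every simulated client pauses exactly two seconds between queries.
ran = simulate_client(["q1", "q2", "q3"], iterations=5, sleep=lambda s: None)
print(len(ran))  # prints 5
```

Fixing both think-time bounds to the same value, as the test configuration does, removes pacing variability so that differences between runs reflect server behavior rather than client randomness.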
For more information: